00:00:00.002 Started by upstream project "autotest-spdk-v24.01-LTS-vs-dpdk-v22.11" build number 509 00:00:00.002 originally caused by: 00:00:00.002 Started by upstream project "nightly-trigger" build number 3174 00:00:00.002 originally caused by: 00:00:00.002 Started by timer 00:00:00.055 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy 00:00:00.057 The recommended git tool is: git 00:00:00.057 using credential 00000000-0000-0000-0000-000000000002 00:00:00.059 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10 00:00:00.085 Fetching changes from the remote Git repository 00:00:00.086 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10 00:00:00.124 Using shallow fetch with depth 1 00:00:00.124 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool 00:00:00.124 > git --version # timeout=10 00:00:00.162 > git --version # 'git version 2.39.2' 00:00:00.162 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials 00:00:00.197 Setting http proxy: proxy-dmz.intel.com:911 00:00:00.197 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5 00:00:05.261 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10 00:00:05.274 > git rev-parse FETCH_HEAD^{commit} # timeout=10 00:00:05.285 Checking out Revision ea7646cba2e992b05bb6a53407de7fbcf465b5c6 (FETCH_HEAD) 00:00:05.285 > git config core.sparsecheckout # timeout=10 00:00:05.296 > git read-tree -mu HEAD # timeout=10 00:00:05.311 > git checkout -f ea7646cba2e992b05bb6a53407de7fbcf465b5c6 # timeout=5 00:00:05.328 Commit message: "ansible/inventory: Fix GP16's BMC address" 00:00:05.328 > git rev-list --no-walk ea7646cba2e992b05bb6a53407de7fbcf465b5c6 # timeout=10 00:00:05.421 [Pipeline] Start of Pipeline 00:00:05.437 [Pipeline] library 00:00:05.438 Loading library shm_lib@master 00:00:05.438 Library shm_lib@master is cached. Copying from home. 00:00:05.457 [Pipeline] node 00:00:05.469 Running on WFP39 in /var/jenkins/workspace/short-fuzz-phy-autotest 00:00:05.471 [Pipeline] { 00:00:05.481 [Pipeline] catchError 00:00:05.482 [Pipeline] { 00:00:05.497 [Pipeline] wrap 00:00:05.510 [Pipeline] { 00:00:05.518 [Pipeline] stage 00:00:05.520 [Pipeline] { (Prologue) 00:00:05.700 [Pipeline] sh 00:00:06.013 + logger -p user.info -t JENKINS-CI 00:00:06.031 [Pipeline] echo 00:00:06.033 Node: WFP39 00:00:06.038 [Pipeline] sh 00:00:06.328 [Pipeline] setCustomBuildProperty 00:00:06.337 [Pipeline] echo 00:00:06.338 Cleanup processes 00:00:06.342 [Pipeline] sh 00:00:06.616 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:06.616 2561212 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:06.629 [Pipeline] sh 00:00:06.907 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:06.907 ++ grep -v 'sudo pgrep' 00:00:06.907 ++ awk '{print $1}' 00:00:06.907 + sudo kill -9 00:00:06.907 + true 00:00:06.918 [Pipeline] cleanWs 00:00:06.925 [WS-CLEANUP] Deleting project workspace... 00:00:06.925 [WS-CLEANUP] Deferred wipeout is used... 
00:00:06.930 [WS-CLEANUP] done 00:00:06.934 [Pipeline] setCustomBuildProperty 00:00:06.945 [Pipeline] sh 00:00:07.219 + sudo git config --global --replace-all safe.directory '*' 00:00:07.286 [Pipeline] nodesByLabel 00:00:07.287 Found a total of 2 nodes with the 'sorcerer' label 00:00:07.297 [Pipeline] httpRequest 00:00:07.301 HttpMethod: GET 00:00:07.302 URL: http://10.211.164.101/packages/jbp_ea7646cba2e992b05bb6a53407de7fbcf465b5c6.tar.gz 00:00:07.304 Sending request to url: http://10.211.164.101/packages/jbp_ea7646cba2e992b05bb6a53407de7fbcf465b5c6.tar.gz 00:00:07.330 Response Code: HTTP/1.1 200 OK 00:00:07.330 Success: Status code 200 is in the accepted range: 200,404 00:00:07.331 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/jbp_ea7646cba2e992b05bb6a53407de7fbcf465b5c6.tar.gz 00:00:25.325 [Pipeline] sh 00:00:25.608 + tar --no-same-owner -xf jbp_ea7646cba2e992b05bb6a53407de7fbcf465b5c6.tar.gz 00:00:25.627 [Pipeline] httpRequest 00:00:25.631 HttpMethod: GET 00:00:25.632 URL: http://10.211.164.101/packages/spdk_130b9406a1d197d63453b42652430be9d1b0727e.tar.gz 00:00:25.632 Sending request to url: http://10.211.164.101/packages/spdk_130b9406a1d197d63453b42652430be9d1b0727e.tar.gz 00:00:25.646 Response Code: HTTP/1.1 200 OK 00:00:25.646 Success: Status code 200 is in the accepted range: 200,404 00:00:25.647 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk_130b9406a1d197d63453b42652430be9d1b0727e.tar.gz 00:00:52.392 [Pipeline] sh 00:00:52.675 + tar --no-same-owner -xf spdk_130b9406a1d197d63453b42652430be9d1b0727e.tar.gz 00:00:55.970 [Pipeline] sh 00:00:56.252 + git -C spdk log --oneline -n5 00:00:56.252 130b9406a test/nvmf: replace rpc_cmd() with direct invocation of rpc.py due to inherently larger timeout 00:00:56.252 5d3fd6726 bdev: Fix a race bug between unregistration and QoS poller 00:00:56.252 fbc673ece test/scheduler: Meassure utime of $spdk_pid threads as a fallback 00:00:56.252 3651466d0 test/scheduler: Calculate median of the cpu load samples 00:00:56.252 a7414547f test/scheduler: Make sure stderr is not O_TRUNCated in move_proc() 00:00:56.270 [Pipeline] withCredentials 00:00:56.281 > git --version # timeout=10 00:00:56.294 > git --version # 'git version 2.39.2' 00:00:56.311 Masking supported pattern matches of $GIT_PASSWORD or $GIT_ASKPASS 00:00:56.313 [Pipeline] { 00:00:56.323 [Pipeline] retry 00:00:56.325 [Pipeline] { 00:00:56.343 [Pipeline] sh 00:00:56.628 + git ls-remote http://dpdk.org/git/dpdk-stable v23.11 00:00:56.640 [Pipeline] } 00:00:56.657 [Pipeline] // retry 00:00:56.663 [Pipeline] } 00:00:56.682 [Pipeline] // withCredentials 00:00:56.693 [Pipeline] httpRequest 00:00:56.698 HttpMethod: GET 00:00:56.699 URL: http://10.211.164.101/packages/dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz 00:00:56.699 Sending request to url: http://10.211.164.101/packages/dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz 00:00:56.708 Response Code: HTTP/1.1 200 OK 00:00:56.709 Success: Status code 200 is in the accepted range: 200,404 00:00:56.709 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz 00:01:33.509 [Pipeline] sh 00:01:33.794 + tar --no-same-owner -xf dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz 00:01:35.708 [Pipeline] sh 00:01:35.992 + git -C dpdk log --oneline -n5 00:01:35.992 eeb0605f11 version: 23.11.0 00:01:35.992 238778122a doc: update release notes for 23.11 00:01:35.992 46aa6b3cfc doc: fix description of RSS features 00:01:35.992 
dd88f51a57 devtools: forbid DPDK API in cnxk base driver 00:01:35.992 7e421ae345 devtools: support skipping forbid rule check 00:01:36.003 [Pipeline] } 00:01:36.019 [Pipeline] // stage 00:01:36.028 [Pipeline] stage 00:01:36.030 [Pipeline] { (Prepare) 00:01:36.046 [Pipeline] writeFile 00:01:36.058 [Pipeline] sh 00:01:36.335 + logger -p user.info -t JENKINS-CI 00:01:36.348 [Pipeline] sh 00:01:36.629 + logger -p user.info -t JENKINS-CI 00:01:36.641 [Pipeline] sh 00:01:36.958 + cat autorun-spdk.conf 00:01:36.958 SPDK_RUN_FUNCTIONAL_TEST=1 00:01:36.958 SPDK_RUN_UBSAN=1 00:01:36.958 SPDK_TEST_FUZZER=1 00:01:36.958 SPDK_TEST_FUZZER_SHORT=1 00:01:36.958 SPDK_TEST_NATIVE_DPDK=v23.11 00:01:36.958 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:36.964 RUN_NIGHTLY=1 00:01:36.969 [Pipeline] readFile 00:01:36.992 [Pipeline] withEnv 00:01:36.994 [Pipeline] { 00:01:37.009 [Pipeline] sh 00:01:37.293 + set -ex 00:01:37.293 + [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf ]] 00:01:37.293 + source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:01:37.293 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:01:37.293 ++ SPDK_RUN_UBSAN=1 00:01:37.293 ++ SPDK_TEST_FUZZER=1 00:01:37.293 ++ SPDK_TEST_FUZZER_SHORT=1 00:01:37.293 ++ SPDK_TEST_NATIVE_DPDK=v23.11 00:01:37.293 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:37.293 ++ RUN_NIGHTLY=1 00:01:37.293 + case $SPDK_TEST_NVMF_NICS in 00:01:37.293 + DRIVERS= 00:01:37.293 + [[ -n '' ]] 00:01:37.293 + exit 0 00:01:37.302 [Pipeline] } 00:01:37.322 [Pipeline] // withEnv 00:01:37.328 [Pipeline] } 00:01:37.346 [Pipeline] // stage 00:01:37.357 [Pipeline] catchError 00:01:37.359 [Pipeline] { 00:01:37.377 [Pipeline] timeout 00:01:37.377 Timeout set to expire in 30 min 00:01:37.379 [Pipeline] { 00:01:37.394 [Pipeline] stage 00:01:37.396 [Pipeline] { (Tests) 00:01:37.408 [Pipeline] sh 00:01:37.729 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/short-fuzz-phy-autotest 00:01:37.729 ++ readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest 00:01:37.729 + DIR_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest 00:01:37.729 + [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest ]] 00:01:37.729 + DIR_SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:37.729 + DIR_OUTPUT=/var/jenkins/workspace/short-fuzz-phy-autotest/output 00:01:37.729 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk ]] 00:01:37.729 + [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]] 00:01:37.729 + mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/output 00:01:37.729 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]] 00:01:37.729 + [[ short-fuzz-phy-autotest == pkgdep-* ]] 00:01:37.729 + cd /var/jenkins/workspace/short-fuzz-phy-autotest 00:01:37.729 + source /etc/os-release 00:01:37.729 ++ NAME='Fedora Linux' 00:01:37.729 ++ VERSION='38 (Cloud Edition)' 00:01:37.729 ++ ID=fedora 00:01:37.729 ++ VERSION_ID=38 00:01:37.729 ++ VERSION_CODENAME= 00:01:37.729 ++ PLATFORM_ID=platform:f38 00:01:37.729 ++ PRETTY_NAME='Fedora Linux 38 (Cloud Edition)' 00:01:37.729 ++ ANSI_COLOR='0;38;2;60;110;180' 00:01:37.729 ++ LOGO=fedora-logo-icon 00:01:37.729 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:38 00:01:37.729 ++ HOME_URL=https://fedoraproject.org/ 00:01:37.729 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f38/system-administrators-guide/ 00:01:37.729 ++ SUPPORT_URL=https://ask.fedoraproject.org/ 00:01:37.729 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/ 00:01:37.729 ++ REDHAT_BUGZILLA_PRODUCT=Fedora 00:01:37.729 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=38 00:01:37.729 ++ REDHAT_SUPPORT_PRODUCT=Fedora 00:01:37.729 ++ REDHAT_SUPPORT_PRODUCT_VERSION=38 00:01:37.729 ++ SUPPORT_END=2024-05-14 00:01:37.729 ++ VARIANT='Cloud Edition' 00:01:37.729 ++ VARIANT_ID=cloud 00:01:37.729 + uname -a 00:01:37.729 Linux spdk-wfp-39 6.7.0-68.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Jan 15 02:47:10 UTC 2024 x86_64 GNU/Linux 00:01:37.729 + sudo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:01:41.018 Hugepages 00:01:41.018 node hugesize free / total 00:01:41.018 node0 1048576kB 0 / 0 00:01:41.018 node0 2048kB 0 / 0 00:01:41.018 node1 1048576kB 0 / 0 00:01:41.018 node1 2048kB 0 / 0 00:01:41.018 00:01:41.018 Type BDF Vendor Device NUMA Driver Device Block devices 00:01:41.018 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:01:41.018 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:01:41.018 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:01:41.018 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:01:41.018 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:01:41.018 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:01:41.018 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:01:41.018 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:01:41.018 NVMe 0000:1a:00.0 8086 0a54 0 nvme nvme0 nvme0n1 00:01:41.018 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:01:41.018 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:01:41.277 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:01:41.277 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:01:41.277 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:01:41.277 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:01:41.277 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:01:41.278 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:01:41.278 + rm -f /tmp/spdk-ld-path 00:01:41.278 + source autorun-spdk.conf 00:01:41.278 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:01:41.278 ++ SPDK_RUN_UBSAN=1 00:01:41.278 ++ SPDK_TEST_FUZZER=1 00:01:41.278 ++ SPDK_TEST_FUZZER_SHORT=1 00:01:41.278 ++ SPDK_TEST_NATIVE_DPDK=v23.11 00:01:41.278 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:41.278 ++ RUN_NIGHTLY=1 00:01:41.278 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 )) 00:01:41.278 + [[ -n '' ]] 00:01:41.278 + sudo git config --global --add safe.directory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:41.278 + for M in /var/spdk/build-*-manifest.txt 00:01:41.278 + [[ -f 
/var/spdk/build-pkg-manifest.txt ]] 00:01:41.278 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:01:41.278 + for M in /var/spdk/build-*-manifest.txt 00:01:41.278 + [[ -f /var/spdk/build-repo-manifest.txt ]] 00:01:41.278 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:01:41.278 ++ uname 00:01:41.278 + [[ Linux == \L\i\n\u\x ]] 00:01:41.278 + sudo dmesg -T 00:01:41.278 + sudo dmesg --clear 00:01:41.278 + dmesg_pid=2562275 00:01:41.278 + [[ Fedora Linux == FreeBSD ]] 00:01:41.278 + export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:01:41.278 + UNBIND_ENTIRE_IOMMU_GROUP=yes 00:01:41.278 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]] 00:01:41.278 + export VM_IMAGE=/var/spdk/dependencies/vhost/spdk_test_image.qcow2 00:01:41.278 + VM_IMAGE=/var/spdk/dependencies/vhost/spdk_test_image.qcow2 00:01:41.278 + sudo dmesg -Tw 00:01:41.278 + [[ -x /usr/src/fio-static/fio ]] 00:01:41.278 + export FIO_BIN=/usr/src/fio-static/fio 00:01:41.278 + FIO_BIN=/usr/src/fio-static/fio 00:01:41.278 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\s\h\o\r\t\-\f\u\z\z\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]] 00:01:41.278 + [[ ! -v VFIO_QEMU_BIN ]] 00:01:41.278 + [[ -e /usr/local/qemu/vfio-user-latest ]] 00:01:41.278 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:01:41.278 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:01:41.278 + [[ -e /usr/local/qemu/vanilla-latest ]] 00:01:41.278 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:01:41.278 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:01:41.278 + spdk/autorun.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:01:41.278 Test configuration: 00:01:41.278 SPDK_RUN_FUNCTIONAL_TEST=1 00:01:41.278 SPDK_RUN_UBSAN=1 00:01:41.278 SPDK_TEST_FUZZER=1 00:01:41.278 SPDK_TEST_FUZZER_SHORT=1 00:01:41.278 SPDK_TEST_NATIVE_DPDK=v23.11 00:01:41.278 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:41.537 RUN_NIGHTLY=1 11:55:54 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:01:41.537 11:55:54 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]] 00:01:41.537 11:55:54 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:01:41.537 11:55:54 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:01:41.538 11:55:54 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:41.538 11:55:54 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:41.538 11:55:54 -- paths/export.sh@4 -- $ 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:41.538 11:55:54 -- paths/export.sh@5 -- $ export PATH 00:01:41.538 11:55:54 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:41.538 11:55:54 -- common/autobuild_common.sh@434 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:01:41.538 11:55:54 -- common/autobuild_common.sh@435 -- $ date +%s 00:01:41.538 11:55:54 -- common/autobuild_common.sh@435 -- $ mktemp -dt spdk_1718099754.XXXXXX 00:01:41.538 11:55:54 -- common/autobuild_common.sh@435 -- $ SPDK_WORKSPACE=/tmp/spdk_1718099754.ja95cx 00:01:41.538 11:55:54 -- common/autobuild_common.sh@437 -- $ [[ -n '' ]] 00:01:41.538 11:55:54 -- common/autobuild_common.sh@441 -- $ '[' -n v23.11 ']' 00:01:41.538 11:55:54 -- common/autobuild_common.sh@442 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:41.538 11:55:54 -- common/autobuild_common.sh@442 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk' 00:01:41.538 11:55:54 -- common/autobuild_common.sh@448 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp' 00:01:41.538 11:55:54 -- common/autobuild_common.sh@450 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:01:41.538 11:55:54 -- common/autobuild_common.sh@451 -- $ get_config_params 00:01:41.538 11:55:54 -- common/autotest_common.sh@387 -- $ xtrace_disable 00:01:41.538 11:55:54 -- common/autotest_common.sh@10 -- $ set +x 00:01:41.538 11:55:54 -- common/autobuild_common.sh@451 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user' 00:01:41.538 11:55:54 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD= 00:01:41.538 11:55:54 -- spdk/autobuild.sh@12 -- $ umask 022 00:01:41.538 11:55:54 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:41.538 11:55:54 -- spdk/autobuild.sh@16 -- $ date -u 00:01:41.538 Tue Jun 11 09:55:54 AM UTC 2024 00:01:41.538 11:55:54 -- spdk/autobuild.sh@17 -- $ git describe --tags 00:01:41.538 LTS-43-g130b9406a 00:01:41.538 11:55:54 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']' 00:01:41.538 11:55:54 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']' 00:01:41.538 11:55:54 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan' 00:01:41.538 11:55:54 -- common/autotest_common.sh@1077 -- $ '[' 3 -le 1 ']' 00:01:41.538 11:55:54 -- common/autotest_common.sh@1083 -- $ 
xtrace_disable 00:01:41.538 11:55:54 -- common/autotest_common.sh@10 -- $ set +x 00:01:41.538 ************************************ 00:01:41.538 START TEST ubsan 00:01:41.538 ************************************ 00:01:41.538 11:55:54 -- common/autotest_common.sh@1104 -- $ echo 'using ubsan' 00:01:41.538 using ubsan 00:01:41.538 00:01:41.538 real 0m0.000s 00:01:41.538 user 0m0.000s 00:01:41.538 sys 0m0.000s 00:01:41.538 11:55:54 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:01:41.538 11:55:54 -- common/autotest_common.sh@10 -- $ set +x 00:01:41.538 ************************************ 00:01:41.538 END TEST ubsan 00:01:41.538 ************************************ 00:01:41.538 11:55:54 -- spdk/autobuild.sh@27 -- $ '[' -n v23.11 ']' 00:01:41.538 11:55:54 -- spdk/autobuild.sh@28 -- $ build_native_dpdk 00:01:41.538 11:55:54 -- common/autobuild_common.sh@427 -- $ run_test build_native_dpdk _build_native_dpdk 00:01:41.538 11:55:54 -- common/autotest_common.sh@1077 -- $ '[' 2 -le 1 ']' 00:01:41.538 11:55:54 -- common/autotest_common.sh@1083 -- $ xtrace_disable 00:01:41.538 11:55:54 -- common/autotest_common.sh@10 -- $ set +x 00:01:41.538 ************************************ 00:01:41.538 START TEST build_native_dpdk 00:01:41.538 ************************************ 00:01:41.538 11:55:54 -- common/autotest_common.sh@1104 -- $ _build_native_dpdk 00:01:41.538 11:55:54 -- common/autobuild_common.sh@48 -- $ local external_dpdk_dir 00:01:41.538 11:55:54 -- common/autobuild_common.sh@49 -- $ local external_dpdk_base_dir 00:01:41.538 11:55:54 -- common/autobuild_common.sh@50 -- $ local compiler_version 00:01:41.538 11:55:54 -- common/autobuild_common.sh@51 -- $ local compiler 00:01:41.538 11:55:54 -- common/autobuild_common.sh@52 -- $ local dpdk_kmods 00:01:41.538 11:55:54 -- common/autobuild_common.sh@53 -- $ local repo=dpdk 00:01:41.538 11:55:54 -- common/autobuild_common.sh@55 -- $ compiler=gcc 00:01:41.538 11:55:54 -- common/autobuild_common.sh@61 -- $ export CC=gcc 00:01:41.538 11:55:54 -- common/autobuild_common.sh@61 -- $ CC=gcc 00:01:41.538 11:55:54 -- common/autobuild_common.sh@63 -- $ [[ gcc != *clang* ]] 00:01:41.538 11:55:54 -- common/autobuild_common.sh@63 -- $ [[ gcc != *gcc* ]] 00:01:41.538 11:55:54 -- common/autobuild_common.sh@68 -- $ gcc -dumpversion 00:01:41.538 11:55:54 -- common/autobuild_common.sh@68 -- $ compiler_version=13 00:01:41.538 11:55:54 -- common/autobuild_common.sh@69 -- $ compiler_version=13 00:01:41.538 11:55:54 -- common/autobuild_common.sh@70 -- $ external_dpdk_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:41.538 11:55:54 -- common/autobuild_common.sh@71 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:41.538 11:55:54 -- common/autobuild_common.sh@71 -- $ external_dpdk_base_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:01:41.538 11:55:54 -- common/autobuild_common.sh@73 -- $ [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk ]] 00:01:41.538 11:55:54 -- common/autobuild_common.sh@82 -- $ orgdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:41.538 11:55:54 -- common/autobuild_common.sh@83 -- $ git -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk log --oneline -n 5 00:01:41.538 eeb0605f11 version: 23.11.0 00:01:41.538 238778122a doc: update release notes for 23.11 00:01:41.538 46aa6b3cfc doc: fix description of RSS features 00:01:41.538 dd88f51a57 devtools: forbid DPDK API in cnxk base driver 00:01:41.538 7e421ae345 devtools: support skipping forbid rule check 00:01:41.538 11:55:54 -- common/autobuild_common.sh@85 -- $ dpdk_cflags='-fPIC -g -fcommon' 00:01:41.538 11:55:54 -- common/autobuild_common.sh@86 -- $ dpdk_ldflags= 00:01:41.538 11:55:54 -- common/autobuild_common.sh@87 -- $ dpdk_ver=23.11.0 00:01:41.538 11:55:54 -- common/autobuild_common.sh@89 -- $ [[ gcc == *gcc* ]] 00:01:41.538 11:55:54 -- common/autobuild_common.sh@89 -- $ [[ 13 -ge 5 ]] 00:01:41.538 11:55:54 -- common/autobuild_common.sh@90 -- $ dpdk_cflags+=' -Werror' 00:01:41.538 11:55:54 -- common/autobuild_common.sh@93 -- $ [[ gcc == *gcc* ]] 00:01:41.538 11:55:54 -- common/autobuild_common.sh@93 -- $ [[ 13 -ge 10 ]] 00:01:41.538 11:55:54 -- common/autobuild_common.sh@94 -- $ dpdk_cflags+=' -Wno-stringop-overflow' 00:01:41.538 11:55:54 -- common/autobuild_common.sh@100 -- $ DPDK_DRIVERS=("bus" "bus/pci" "bus/vdev" "mempool/ring" "net/i40e" "net/i40e/base") 00:01:41.538 11:55:54 -- common/autobuild_common.sh@102 -- $ local mlx5_libs_added=n 00:01:41.538 11:55:54 -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]] 00:01:41.538 11:55:54 -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]] 00:01:41.538 11:55:54 -- common/autobuild_common.sh@139 -- $ [[ 0 -eq 1 ]] 00:01:41.538 11:55:54 -- common/autobuild_common.sh@167 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:01:41.538 11:55:54 -- common/autobuild_common.sh@168 -- $ uname -s 00:01:41.538 11:55:54 -- common/autobuild_common.sh@168 -- $ '[' Linux = Linux ']' 00:01:41.538 11:55:54 -- common/autobuild_common.sh@169 -- $ lt 23.11.0 21.11.0 00:01:41.538 11:55:54 -- scripts/common.sh@372 -- $ cmp_versions 23.11.0 '<' 21.11.0 00:01:41.538 11:55:54 -- scripts/common.sh@332 -- $ local ver1 ver1_l 00:01:41.538 11:55:54 -- scripts/common.sh@333 -- $ local ver2 ver2_l 00:01:41.538 11:55:54 -- scripts/common.sh@335 -- $ IFS=.-: 00:01:41.538 11:55:54 -- scripts/common.sh@335 -- $ read -ra ver1 00:01:41.538 11:55:54 -- scripts/common.sh@336 -- $ IFS=.-: 00:01:41.538 11:55:54 -- scripts/common.sh@336 -- $ read -ra ver2 00:01:41.538 11:55:54 -- scripts/common.sh@337 -- $ local 'op=<' 00:01:41.538 11:55:54 -- scripts/common.sh@339 -- $ ver1_l=3 00:01:41.538 11:55:54 -- scripts/common.sh@340 -- $ ver2_l=3 00:01:41.538 11:55:54 -- scripts/common.sh@342 -- $ local lt=0 gt=0 eq=0 v 00:01:41.538 11:55:54 -- scripts/common.sh@343 -- $ case "$op" in 00:01:41.538 11:55:54 -- scripts/common.sh@344 -- $ : 1 00:01:41.538 11:55:54 -- scripts/common.sh@363 -- $ (( v = 0 )) 00:01:41.538 11:55:54 -- scripts/common.sh@363 -- $ (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:01:41.538 11:55:54 -- scripts/common.sh@364 -- $ decimal 23 00:01:41.538 11:55:54 -- scripts/common.sh@352 -- $ local d=23 00:01:41.538 11:55:54 -- scripts/common.sh@353 -- $ [[ 23 =~ ^[0-9]+$ ]] 00:01:41.538 11:55:54 -- scripts/common.sh@354 -- $ echo 23 00:01:41.538 11:55:54 -- scripts/common.sh@364 -- $ ver1[v]=23 00:01:41.538 11:55:54 -- scripts/common.sh@365 -- $ decimal 21 00:01:41.538 11:55:54 -- scripts/common.sh@352 -- $ local d=21 00:01:41.538 11:55:54 -- scripts/common.sh@353 -- $ [[ 21 =~ ^[0-9]+$ ]] 00:01:41.538 11:55:54 -- scripts/common.sh@354 -- $ echo 21 00:01:41.538 11:55:54 -- scripts/common.sh@365 -- $ ver2[v]=21 00:01:41.538 11:55:54 -- scripts/common.sh@366 -- $ (( ver1[v] > ver2[v] )) 00:01:41.538 11:55:54 -- scripts/common.sh@366 -- $ return 1 00:01:41.538 11:55:54 -- common/autobuild_common.sh@173 -- $ patch -p1 00:01:41.538 patching file config/rte_config.h 00:01:41.538 Hunk #1 succeeded at 60 (offset 1 line). 00:01:41.538 11:55:54 -- common/autobuild_common.sh@177 -- $ dpdk_kmods=false 00:01:41.538 11:55:54 -- common/autobuild_common.sh@178 -- $ uname -s 00:01:41.538 11:55:54 -- common/autobuild_common.sh@178 -- $ '[' Linux = FreeBSD ']' 00:01:41.539 11:55:54 -- common/autobuild_common.sh@182 -- $ printf %s, bus bus/pci bus/vdev mempool/ring net/i40e net/i40e/base 00:01:41.539 11:55:54 -- common/autobuild_common.sh@182 -- $ meson build-tmp --prefix=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --libdir lib -Denable_docs=false -Denable_kmods=false -Dtests=false -Dc_link_args= '-Dc_args=-fPIC -g -fcommon -Werror -Wno-stringop-overflow' -Dmachine=native -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base, 00:01:46.808 The Meson build system 00:01:46.808 Version: 1.3.1 00:01:46.808 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:01:46.808 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp 00:01:46.808 Build type: native build 00:01:46.808 Program cat found: YES (/usr/bin/cat) 00:01:46.808 Project name: DPDK 00:01:46.808 Project version: 23.11.0 00:01:46.808 C compiler for the host machine: gcc (gcc 13.2.1 "gcc (GCC) 13.2.1 20231011 (Red Hat 13.2.1-4)") 00:01:46.808 C linker for the host machine: gcc ld.bfd 2.39-16 00:01:46.808 Host machine cpu family: x86_64 00:01:46.808 Host machine cpu: x86_64 00:01:46.808 Message: ## Building in Developer Mode ## 00:01:46.808 Program pkg-config found: YES (/usr/bin/pkg-config) 00:01:46.808 Program check-symbols.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/check-symbols.sh) 00:01:46.808 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/options-ibverbs-static.sh) 00:01:46.808 Program python3 found: YES (/usr/bin/python3) 00:01:46.808 Program cat found: YES (/usr/bin/cat) 00:01:46.808 config/meson.build:113: WARNING: The "machine" option is deprecated. Please use "cpu_instruction_set" instead. 
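
The `lt 23.11.0 21.11.0` xtrace above is the shell-level version check that gates the rte_config.h compatibility patch applied just before meson runs: both version strings are split on `.`, `-` and `:` into arrays (ver1, ver2) and compared numerically component by component. Below is a minimal standalone sketch of the same split-and-compare idea, assuming the illustrative name `version_lt`; it is not the verbatim scripts/common.sh implementation.

    # Return 0 (true) iff $1 is strictly older than $2. A sketch of the
    # cmp_versions trace above, not the exact SPDK helper.
    version_lt() {
        local -a ver1 ver2
        local v
        IFS=.-: read -ra ver1 <<< "$1"   # "23.11.0" -> (23 11 0)
        IFS=.-: read -ra ver2 <<< "$2"
        for ((v = 0; v < (${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]}); v++)); do
            ((${ver1[v]:-0} < ${ver2[v]:-0})) && return 0   # missing parts count as 0
            ((${ver1[v]:-0} > ${ver2[v]:-0})) && return 1
        done
        return 1   # equal versions are not "less than"
    }

    version_lt 23.11.0 21.11.0 || echo "23.11.0 is not older than 21.11.0"

This matches the trace: ver1[0]=23 exceeds ver2[0]=21 at the first component, so the check returns 1 and the script proceeds to the `patch -p1` visible right after it.
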
00:01:46.808 Compiler for C supports arguments -march=native: YES 00:01:46.808 Checking for size of "void *" : 8 00:01:46.808 Checking for size of "void *" : 8 (cached) 00:01:46.808 Library m found: YES 00:01:46.808 Library numa found: YES 00:01:46.808 Has header "numaif.h" : YES 00:01:46.808 Library fdt found: NO 00:01:46.808 Library execinfo found: NO 00:01:46.808 Has header "execinfo.h" : YES 00:01:46.808 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0 00:01:46.808 Run-time dependency libarchive found: NO (tried pkgconfig) 00:01:46.808 Run-time dependency libbsd found: NO (tried pkgconfig) 00:01:46.808 Run-time dependency jansson found: NO (tried pkgconfig) 00:01:46.808 Run-time dependency openssl found: YES 3.0.9 00:01:46.808 Run-time dependency libpcap found: YES 1.10.4 00:01:46.808 Has header "pcap.h" with dependency libpcap: YES 00:01:46.808 Compiler for C supports arguments -Wcast-qual: YES 00:01:46.808 Compiler for C supports arguments -Wdeprecated: YES 00:01:46.808 Compiler for C supports arguments -Wformat: YES 00:01:46.808 Compiler for C supports arguments -Wformat-nonliteral: NO 00:01:46.808 Compiler for C supports arguments -Wformat-security: NO 00:01:46.808 Compiler for C supports arguments -Wmissing-declarations: YES 00:01:46.808 Compiler for C supports arguments -Wmissing-prototypes: YES 00:01:46.808 Compiler for C supports arguments -Wnested-externs: YES 00:01:46.808 Compiler for C supports arguments -Wold-style-definition: YES 00:01:46.808 Compiler for C supports arguments -Wpointer-arith: YES 00:01:46.808 Compiler for C supports arguments -Wsign-compare: YES 00:01:46.808 Compiler for C supports arguments -Wstrict-prototypes: YES 00:01:46.808 Compiler for C supports arguments -Wundef: YES 00:01:46.808 Compiler for C supports arguments -Wwrite-strings: YES 00:01:46.808 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:01:46.808 Compiler for C supports arguments -Wno-packed-not-aligned: YES 00:01:46.808 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:01:46.808 Compiler for C supports arguments -Wno-zero-length-bounds: YES 00:01:46.809 Program objdump found: YES (/usr/bin/objdump) 00:01:46.809 Compiler for C supports arguments -mavx512f: YES 00:01:46.809 Checking if "AVX512 checking" compiles: YES 00:01:46.809 Fetching value of define "__SSE4_2__" : 1 00:01:46.809 Fetching value of define "__AES__" : 1 00:01:46.809 Fetching value of define "__AVX__" : 1 00:01:46.809 Fetching value of define "__AVX2__" : 1 00:01:46.809 Fetching value of define "__AVX512BW__" : 1 00:01:46.809 Fetching value of define "__AVX512CD__" : 1 00:01:46.809 Fetching value of define "__AVX512DQ__" : 1 00:01:46.809 Fetching value of define "__AVX512F__" : 1 00:01:46.809 Fetching value of define "__AVX512VL__" : 1 00:01:46.809 Fetching value of define "__PCLMUL__" : 1 00:01:46.809 Fetching value of define "__RDRND__" : 1 00:01:46.809 Fetching value of define "__RDSEED__" : 1 00:01:46.809 Fetching value of define "__VPCLMULQDQ__" : (undefined) 00:01:46.809 Fetching value of define "__znver1__" : (undefined) 00:01:46.809 Fetching value of define "__znver2__" : (undefined) 00:01:46.809 Fetching value of define "__znver3__" : (undefined) 00:01:46.809 Fetching value of define "__znver4__" : (undefined) 00:01:46.809 Compiler for C supports arguments -Wno-format-truncation: YES 00:01:46.809 Message: lib/log: Defining dependency "log" 00:01:46.809 Message: lib/kvargs: Defining dependency "kvargs" 00:01:46.809 Message: lib/telemetry: Defining dependency 
"telemetry" 00:01:46.809 Checking for function "getentropy" : NO 00:01:46.809 Message: lib/eal: Defining dependency "eal" 00:01:46.809 Message: lib/ring: Defining dependency "ring" 00:01:46.809 Message: lib/rcu: Defining dependency "rcu" 00:01:46.809 Message: lib/mempool: Defining dependency "mempool" 00:01:46.809 Message: lib/mbuf: Defining dependency "mbuf" 00:01:46.809 Fetching value of define "__PCLMUL__" : 1 (cached) 00:01:46.809 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:46.809 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:46.809 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:01:46.809 Fetching value of define "__AVX512VL__" : 1 (cached) 00:01:46.809 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:01:46.809 Compiler for C supports arguments -mpclmul: YES 00:01:46.809 Compiler for C supports arguments -maes: YES 00:01:46.809 Compiler for C supports arguments -mavx512f: YES (cached) 00:01:46.809 Compiler for C supports arguments -mavx512bw: YES 00:01:46.809 Compiler for C supports arguments -mavx512dq: YES 00:01:46.809 Compiler for C supports arguments -mavx512vl: YES 00:01:46.809 Compiler for C supports arguments -mvpclmulqdq: YES 00:01:46.809 Compiler for C supports arguments -mavx2: YES 00:01:46.809 Compiler for C supports arguments -mavx: YES 00:01:46.809 Message: lib/net: Defining dependency "net" 00:01:46.809 Message: lib/meter: Defining dependency "meter" 00:01:46.809 Message: lib/ethdev: Defining dependency "ethdev" 00:01:46.809 Message: lib/pci: Defining dependency "pci" 00:01:46.809 Message: lib/cmdline: Defining dependency "cmdline" 00:01:46.809 Message: lib/metrics: Defining dependency "metrics" 00:01:46.809 Message: lib/hash: Defining dependency "hash" 00:01:46.809 Message: lib/timer: Defining dependency "timer" 00:01:46.809 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:46.809 Fetching value of define "__AVX512VL__" : 1 (cached) 00:01:46.809 Fetching value of define "__AVX512CD__" : 1 (cached) 00:01:46.809 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:46.809 Message: lib/acl: Defining dependency "acl" 00:01:46.809 Message: lib/bbdev: Defining dependency "bbdev" 00:01:46.809 Message: lib/bitratestats: Defining dependency "bitratestats" 00:01:46.809 Run-time dependency libelf found: YES 0.190 00:01:46.809 Message: lib/bpf: Defining dependency "bpf" 00:01:46.809 Message: lib/cfgfile: Defining dependency "cfgfile" 00:01:46.809 Message: lib/compressdev: Defining dependency "compressdev" 00:01:46.809 Message: lib/cryptodev: Defining dependency "cryptodev" 00:01:46.809 Message: lib/distributor: Defining dependency "distributor" 00:01:46.809 Message: lib/dmadev: Defining dependency "dmadev" 00:01:46.809 Message: lib/efd: Defining dependency "efd" 00:01:46.809 Message: lib/eventdev: Defining dependency "eventdev" 00:01:46.809 Message: lib/dispatcher: Defining dependency "dispatcher" 00:01:46.809 Message: lib/gpudev: Defining dependency "gpudev" 00:01:46.809 Message: lib/gro: Defining dependency "gro" 00:01:46.809 Message: lib/gso: Defining dependency "gso" 00:01:46.809 Message: lib/ip_frag: Defining dependency "ip_frag" 00:01:46.809 Message: lib/jobstats: Defining dependency "jobstats" 00:01:46.809 Message: lib/latencystats: Defining dependency "latencystats" 00:01:46.809 Message: lib/lpm: Defining dependency "lpm" 00:01:46.809 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:46.809 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:01:46.809 Fetching value of define "__AVX512IFMA__" : 
(undefined) 00:01:46.809 Compiler for C supports arguments -mavx512f -mavx512dq -mavx512ifma: YES 00:01:46.809 Message: lib/member: Defining dependency "member" 00:01:46.809 Message: lib/pcapng: Defining dependency "pcapng" 00:01:46.809 Compiler for C supports arguments -Wno-cast-qual: YES 00:01:46.809 Message: lib/power: Defining dependency "power" 00:01:46.809 Message: lib/rawdev: Defining dependency "rawdev" 00:01:46.809 Message: lib/regexdev: Defining dependency "regexdev" 00:01:46.809 Message: lib/mldev: Defining dependency "mldev" 00:01:46.809 Message: lib/rib: Defining dependency "rib" 00:01:46.809 Message: lib/reorder: Defining dependency "reorder" 00:01:46.809 Message: lib/sched: Defining dependency "sched" 00:01:46.809 Message: lib/security: Defining dependency "security" 00:01:46.809 Message: lib/stack: Defining dependency "stack" 00:01:46.809 Has header "linux/userfaultfd.h" : YES 00:01:46.809 Has header "linux/vduse.h" : YES 00:01:46.809 Message: lib/vhost: Defining dependency "vhost" 00:01:46.809 Message: lib/ipsec: Defining dependency "ipsec" 00:01:46.809 Message: lib/pdcp: Defining dependency "pdcp" 00:01:46.809 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:46.809 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:01:46.809 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:46.809 Message: lib/fib: Defining dependency "fib" 00:01:46.809 Message: lib/port: Defining dependency "port" 00:01:46.809 Message: lib/pdump: Defining dependency "pdump" 00:01:46.809 Message: lib/table: Defining dependency "table" 00:01:46.809 Message: lib/pipeline: Defining dependency "pipeline" 00:01:46.809 Message: lib/graph: Defining dependency "graph" 00:01:46.809 Message: lib/node: Defining dependency "node" 00:01:46.809 Compiler for C supports arguments -Wno-format-truncation: YES (cached) 00:01:48.186 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:01:48.186 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:01:48.186 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:01:48.186 Compiler for C supports arguments -Wno-sign-compare: YES 00:01:48.186 Compiler for C supports arguments -Wno-unused-value: YES 00:01:48.186 Compiler for C supports arguments -Wno-format: YES 00:01:48.186 Compiler for C supports arguments -Wno-format-security: YES 00:01:48.186 Compiler for C supports arguments -Wno-format-nonliteral: YES 00:01:48.186 Compiler for C supports arguments -Wno-strict-aliasing: YES 00:01:48.186 Compiler for C supports arguments -Wno-unused-but-set-variable: YES 00:01:48.186 Compiler for C supports arguments -Wno-unused-parameter: YES 00:01:48.186 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:48.186 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:48.186 Compiler for C supports arguments -mavx512f: YES (cached) 00:01:48.186 Compiler for C supports arguments -mavx512bw: YES (cached) 00:01:48.186 Compiler for C supports arguments -march=skylake-avx512: YES 00:01:48.186 Message: drivers/net/i40e: Defining dependency "net_i40e" 00:01:48.186 Has header "sys/epoll.h" : YES 00:01:48.186 Program doxygen found: YES (/usr/bin/doxygen) 00:01:48.186 Configuring doxy-api-html.conf using configuration 00:01:48.186 Configuring doxy-api-man.conf using configuration 00:01:48.186 Program mandb found: YES (/usr/bin/mandb) 00:01:48.186 Program sphinx-build found: NO 00:01:48.186 Configuring rte_build_config.h using configuration 00:01:48.186 Message: 00:01:48.186 ================= 00:01:48.186 Applications Enabled 00:01:48.186 
================= 00:01:48.186 00:01:48.186 apps: 00:01:48.186 dumpcap, graph, pdump, proc-info, test-acl, test-bbdev, test-cmdline, test-compress-perf, 00:01:48.186 test-crypto-perf, test-dma-perf, test-eventdev, test-fib, test-flow-perf, test-gpudev, test-mldev, test-pipeline, 00:01:48.186 test-pmd, test-regex, test-sad, test-security-perf, 00:01:48.186 00:01:48.186 Message: 00:01:48.186 ================= 00:01:48.186 Libraries Enabled 00:01:48.186 ================= 00:01:48.186 00:01:48.186 libs: 00:01:48.186 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf, 00:01:48.186 net, meter, ethdev, pci, cmdline, metrics, hash, timer, 00:01:48.186 acl, bbdev, bitratestats, bpf, cfgfile, compressdev, cryptodev, distributor, 00:01:48.186 dmadev, efd, eventdev, dispatcher, gpudev, gro, gso, ip_frag, 00:01:48.186 jobstats, latencystats, lpm, member, pcapng, power, rawdev, regexdev, 00:01:48.186 mldev, rib, reorder, sched, security, stack, vhost, ipsec, 00:01:48.186 pdcp, fib, port, pdump, table, pipeline, graph, node, 00:01:48.186 00:01:48.186 00:01:48.186 Message: 00:01:48.186 =============== 00:01:48.186 Drivers Enabled 00:01:48.186 =============== 00:01:48.186 00:01:48.186 common: 00:01:48.186 00:01:48.186 bus: 00:01:48.186 pci, vdev, 00:01:48.186 mempool: 00:01:48.186 ring, 00:01:48.186 dma: 00:01:48.186 00:01:48.186 net: 00:01:48.186 i40e, 00:01:48.186 raw: 00:01:48.186 00:01:48.186 crypto: 00:01:48.186 00:01:48.186 compress: 00:01:48.186 00:01:48.186 regex: 00:01:48.186 00:01:48.186 ml: 00:01:48.186 00:01:48.186 vdpa: 00:01:48.186 00:01:48.186 event: 00:01:48.186 00:01:48.186 baseband: 00:01:48.186 00:01:48.186 gpu: 00:01:48.186 00:01:48.186 00:01:48.186 Message: 00:01:48.186 ================= 00:01:48.186 Content Skipped 00:01:48.186 ================= 00:01:48.186 00:01:48.186 apps: 00:01:48.186 00:01:48.186 libs: 00:01:48.186 00:01:48.186 drivers: 00:01:48.186 common/cpt: not in enabled drivers build config 00:01:48.186 common/dpaax: not in enabled drivers build config 00:01:48.186 common/iavf: not in enabled drivers build config 00:01:48.186 common/idpf: not in enabled drivers build config 00:01:48.186 common/mvep: not in enabled drivers build config 00:01:48.186 common/octeontx: not in enabled drivers build config 00:01:48.186 bus/auxiliary: not in enabled drivers build config 00:01:48.186 bus/cdx: not in enabled drivers build config 00:01:48.186 bus/dpaa: not in enabled drivers build config 00:01:48.186 bus/fslmc: not in enabled drivers build config 00:01:48.186 bus/ifpga: not in enabled drivers build config 00:01:48.186 bus/platform: not in enabled drivers build config 00:01:48.186 bus/vmbus: not in enabled drivers build config 00:01:48.186 common/cnxk: not in enabled drivers build config 00:01:48.186 common/mlx5: not in enabled drivers build config 00:01:48.186 common/nfp: not in enabled drivers build config 00:01:48.186 common/qat: not in enabled drivers build config 00:01:48.186 common/sfc_efx: not in enabled drivers build config 00:01:48.186 mempool/bucket: not in enabled drivers build config 00:01:48.186 mempool/cnxk: not in enabled drivers build config 00:01:48.186 mempool/dpaa: not in enabled drivers build config 00:01:48.186 mempool/dpaa2: not in enabled drivers build config 00:01:48.186 mempool/octeontx: not in enabled drivers build config 00:01:48.186 mempool/stack: not in enabled drivers build config 00:01:48.186 dma/cnxk: not in enabled drivers build config 00:01:48.186 dma/dpaa: not in enabled drivers build config 00:01:48.186 dma/dpaa2: not in enabled drivers build 
config 00:01:48.186 dma/hisilicon: not in enabled drivers build config 00:01:48.186 dma/idxd: not in enabled drivers build config 00:01:48.186 dma/ioat: not in enabled drivers build config 00:01:48.187 dma/skeleton: not in enabled drivers build config 00:01:48.187 net/af_packet: not in enabled drivers build config 00:01:48.187 net/af_xdp: not in enabled drivers build config 00:01:48.187 net/ark: not in enabled drivers build config 00:01:48.187 net/atlantic: not in enabled drivers build config 00:01:48.187 net/avp: not in enabled drivers build config 00:01:48.187 net/axgbe: not in enabled drivers build config 00:01:48.187 net/bnx2x: not in enabled drivers build config 00:01:48.187 net/bnxt: not in enabled drivers build config 00:01:48.187 net/bonding: not in enabled drivers build config 00:01:48.187 net/cnxk: not in enabled drivers build config 00:01:48.187 net/cpfl: not in enabled drivers build config 00:01:48.187 net/cxgbe: not in enabled drivers build config 00:01:48.187 net/dpaa: not in enabled drivers build config 00:01:48.187 net/dpaa2: not in enabled drivers build config 00:01:48.187 net/e1000: not in enabled drivers build config 00:01:48.187 net/ena: not in enabled drivers build config 00:01:48.187 net/enetc: not in enabled drivers build config 00:01:48.187 net/enetfec: not in enabled drivers build config 00:01:48.187 net/enic: not in enabled drivers build config 00:01:48.187 net/failsafe: not in enabled drivers build config 00:01:48.187 net/fm10k: not in enabled drivers build config 00:01:48.187 net/gve: not in enabled drivers build config 00:01:48.187 net/hinic: not in enabled drivers build config 00:01:48.187 net/hns3: not in enabled drivers build config 00:01:48.187 net/iavf: not in enabled drivers build config 00:01:48.187 net/ice: not in enabled drivers build config 00:01:48.187 net/idpf: not in enabled drivers build config 00:01:48.187 net/igc: not in enabled drivers build config 00:01:48.187 net/ionic: not in enabled drivers build config 00:01:48.187 net/ipn3ke: not in enabled drivers build config 00:01:48.187 net/ixgbe: not in enabled drivers build config 00:01:48.187 net/mana: not in enabled drivers build config 00:01:48.187 net/memif: not in enabled drivers build config 00:01:48.187 net/mlx4: not in enabled drivers build config 00:01:48.187 net/mlx5: not in enabled drivers build config 00:01:48.187 net/mvneta: not in enabled drivers build config 00:01:48.187 net/mvpp2: not in enabled drivers build config 00:01:48.187 net/netvsc: not in enabled drivers build config 00:01:48.187 net/nfb: not in enabled drivers build config 00:01:48.187 net/nfp: not in enabled drivers build config 00:01:48.187 net/ngbe: not in enabled drivers build config 00:01:48.187 net/null: not in enabled drivers build config 00:01:48.187 net/octeontx: not in enabled drivers build config 00:01:48.187 net/octeon_ep: not in enabled drivers build config 00:01:48.187 net/pcap: not in enabled drivers build config 00:01:48.187 net/pfe: not in enabled drivers build config 00:01:48.187 net/qede: not in enabled drivers build config 00:01:48.187 net/ring: not in enabled drivers build config 00:01:48.187 net/sfc: not in enabled drivers build config 00:01:48.187 net/softnic: not in enabled drivers build config 00:01:48.187 net/tap: not in enabled drivers build config 00:01:48.187 net/thunderx: not in enabled drivers build config 00:01:48.187 net/txgbe: not in enabled drivers build config 00:01:48.187 net/vdev_netvsc: not in enabled drivers build config 00:01:48.187 net/vhost: not in enabled drivers build config 
00:01:48.187 net/virtio: not in enabled drivers build config 00:01:48.187 net/vmxnet3: not in enabled drivers build config 00:01:48.187 raw/cnxk_bphy: not in enabled drivers build config 00:01:48.187 raw/cnxk_gpio: not in enabled drivers build config 00:01:48.187 raw/dpaa2_cmdif: not in enabled drivers build config 00:01:48.187 raw/ifpga: not in enabled drivers build config 00:01:48.187 raw/ntb: not in enabled drivers build config 00:01:48.187 raw/skeleton: not in enabled drivers build config 00:01:48.187 crypto/armv8: not in enabled drivers build config 00:01:48.187 crypto/bcmfs: not in enabled drivers build config 00:01:48.187 crypto/caam_jr: not in enabled drivers build config 00:01:48.187 crypto/ccp: not in enabled drivers build config 00:01:48.187 crypto/cnxk: not in enabled drivers build config 00:01:48.187 crypto/dpaa_sec: not in enabled drivers build config 00:01:48.187 crypto/dpaa2_sec: not in enabled drivers build config 00:01:48.187 crypto/ipsec_mb: not in enabled drivers build config 00:01:48.187 crypto/mlx5: not in enabled drivers build config 00:01:48.187 crypto/mvsam: not in enabled drivers build config 00:01:48.187 crypto/nitrox: not in enabled drivers build config 00:01:48.187 crypto/null: not in enabled drivers build config 00:01:48.187 crypto/octeontx: not in enabled drivers build config 00:01:48.187 crypto/openssl: not in enabled drivers build config 00:01:48.187 crypto/scheduler: not in enabled drivers build config 00:01:48.187 crypto/uadk: not in enabled drivers build config 00:01:48.187 crypto/virtio: not in enabled drivers build config 00:01:48.187 compress/isal: not in enabled drivers build config 00:01:48.187 compress/mlx5: not in enabled drivers build config 00:01:48.187 compress/octeontx: not in enabled drivers build config 00:01:48.187 compress/zlib: not in enabled drivers build config 00:01:48.187 regex/mlx5: not in enabled drivers build config 00:01:48.187 regex/cn9k: not in enabled drivers build config 00:01:48.187 ml/cnxk: not in enabled drivers build config 00:01:48.187 vdpa/ifc: not in enabled drivers build config 00:01:48.187 vdpa/mlx5: not in enabled drivers build config 00:01:48.187 vdpa/nfp: not in enabled drivers build config 00:01:48.187 vdpa/sfc: not in enabled drivers build config 00:01:48.187 event/cnxk: not in enabled drivers build config 00:01:48.187 event/dlb2: not in enabled drivers build config 00:01:48.187 event/dpaa: not in enabled drivers build config 00:01:48.187 event/dpaa2: not in enabled drivers build config 00:01:48.187 event/dsw: not in enabled drivers build config 00:01:48.187 event/opdl: not in enabled drivers build config 00:01:48.187 event/skeleton: not in enabled drivers build config 00:01:48.187 event/sw: not in enabled drivers build config 00:01:48.187 event/octeontx: not in enabled drivers build config 00:01:48.187 baseband/acc: not in enabled drivers build config 00:01:48.187 baseband/fpga_5gnr_fec: not in enabled drivers build config 00:01:48.187 baseband/fpga_lte_fec: not in enabled drivers build config 00:01:48.187 baseband/la12xx: not in enabled drivers build config 00:01:48.187 baseband/null: not in enabled drivers build config 00:01:48.187 baseband/turbo_sw: not in enabled drivers build config 00:01:48.187 gpu/cuda: not in enabled drivers build config 00:01:48.187 00:01:48.187 00:01:48.187 Build targets in project: 217 00:01:48.187 00:01:48.187 DPDK 23.11.0 00:01:48.187 00:01:48.187 User defined options 00:01:48.187 libdir : lib 00:01:48.187 prefix : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 
00:01:48.187 c_args : -fPIC -g -fcommon -Werror -Wno-stringop-overflow 00:01:48.187 c_link_args : 00:01:48.187 enable_docs : false 00:01:48.187 enable_drivers: bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base, 00:01:48.187 enable_kmods : false 00:01:48.187 machine : native 00:01:48.187 tests : false 00:01:48.187 00:01:48.187 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:01:48.187 WARNING: Running the setup command as `meson [options]` instead of `meson setup [options]` is ambiguous and deprecated. 00:01:48.187 11:56:01 -- common/autobuild_common.sh@186 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j72 00:01:48.187 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp' 00:01:48.447 [1/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:01:48.447 [2/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:01:48.447 [3/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:01:48.447 [4/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:01:48.447 [5/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:01:48.447 [6/707] Compiling C object lib/librte_log.a.p/log_log_linux.c.o 00:01:48.447 [7/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:01:48.447 [8/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:01:48.447 [9/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:01:48.447 [10/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:01:48.447 [11/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:01:48.447 [12/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:01:48.447 [13/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:01:48.447 [14/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:01:48.447 [15/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:01:48.447 [16/707] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:01:48.447 [17/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:01:48.447 [18/707] Linking static target lib/librte_kvargs.a 00:01:48.447 [19/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:01:48.709 [20/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:01:48.709 [21/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:01:48.709 [22/707] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:01:48.709 [23/707] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:01:48.709 [24/707] Compiling C object lib/librte_log.a.p/log_log.c.o 00:01:48.709 [25/707] Linking static target lib/librte_log.a 00:01:48.977 [26/707] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:01:48.977 [27/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:01:48.977 [28/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:01:48.977 [29/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:01:48.977 [30/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:01:48.977 [31/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:01:48.977 [32/707] Compiling C object 
lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:01:48.977 [33/707] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:01:48.977 [34/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:01:48.977 [35/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:01:49.234 [36/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:01:49.234 [37/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:01:49.234 [38/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:01:49.234 [39/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:01:49.234 [40/707] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:01:49.234 [41/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:01:49.234 [42/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:01:49.234 [43/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:01:49.234 [44/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:01:49.234 [45/707] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:01:49.234 [46/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:01:49.234 [47/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:01:49.234 [48/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:01:49.234 [49/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:01:49.234 [50/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:01:49.234 [51/707] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:01:49.234 [52/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:01:49.234 [53/707] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:01:49.234 [54/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:01:49.234 [55/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:01:49.234 [56/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:01:49.234 [57/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:01:49.234 [58/707] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:01:49.234 [59/707] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:01:49.234 [60/707] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:01:49.234 [61/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:01:49.234 [62/707] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:01:49.234 [63/707] Linking static target lib/librte_ring.a 00:01:49.234 [64/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:01:49.234 [65/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:01:49.234 [66/707] Linking static target lib/librte_pci.a 00:01:49.234 [67/707] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:01:49.234 [68/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:01:49.234 [69/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:01:49.234 [70/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:01:49.234 [71/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:01:49.234 [72/707] Compiling C object 
lib/librte_meter.a.p/meter_rte_meter.c.o 00:01:49.234 [73/707] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:01:49.234 [74/707] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:01:49.234 [75/707] Linking static target lib/librte_meter.a 00:01:49.492 [76/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:01:49.492 [77/707] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:01:49.492 [78/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:01:49.492 [79/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:01:49.492 [80/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:01:49.492 [81/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:01:49.492 [82/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:01:49.492 [83/707] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:01:49.492 [84/707] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:01:49.492 [85/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:01:49.492 [86/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:01:49.492 [87/707] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:01:49.492 [88/707] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.492 [89/707] Linking static target lib/net/libnet_crc_avx512_lib.a 00:01:49.492 [90/707] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:01:49.492 [91/707] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:01:49.492 [92/707] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:01:49.492 [93/707] Linking static target lib/librte_net.a 00:01:49.492 [94/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:01:49.492 [95/707] Linking target lib/librte_log.so.24.0 00:01:49.492 [96/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:01:49.756 [97/707] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:01:49.756 [98/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:01:49.756 [99/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:01:49.756 [100/707] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.756 [101/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:01:49.756 [102/707] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:01:49.756 [103/707] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.756 [104/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:01:49.756 [105/707] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.756 [106/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:01:49.756 [107/707] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:01:49.756 [108/707] Generating symbol file lib/librte_log.so.24.0.p/librte_log.so.24.0.symbols 00:01:49.756 [109/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:01:49.756 [110/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:01:49.756 [111/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:01:49.756 [112/707] Linking target 
lib/librte_kvargs.so.24.0 00:01:49.756 [113/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:01:50.017 [114/707] Compiling C object lib/librte_acl.a.p/acl_tb_mem.c.o 00:01:50.017 [115/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:01:50.017 [116/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:01:50.017 [117/707] Compiling C object lib/librte_cfgfile.a.p/cfgfile_rte_cfgfile.c.o 00:01:50.017 [118/707] Linking static target lib/librte_cmdline.a 00:01:50.017 [119/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:01:50.017 [120/707] Linking static target lib/librte_cfgfile.a 00:01:50.017 [121/707] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.017 [122/707] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:01:50.017 [123/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:01:50.017 [124/707] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics.c.o 00:01:50.017 [125/707] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:01:50.017 [126/707] Linking static target lib/librte_mempool.a 00:01:50.017 [127/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:01:50.017 [128/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf.c.o 00:01:50.017 [129/707] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics_telemetry.c.o 00:01:50.017 [130/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_stub.c.o 00:01:50.017 [131/707] Linking static target lib/librte_metrics.a 00:01:50.017 [132/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:01:50.017 [133/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:01:50.017 [134/707] Generating symbol file lib/librte_kvargs.so.24.0.p/librte_kvargs.so.24.0.symbols 00:01:50.017 [135/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_dump.c.o 00:01:50.277 [136/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:01:50.277 [137/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load.c.o 00:01:50.277 [138/707] Compiling C object lib/librte_acl.a.p/acl_rte_acl.c.o 00:01:50.277 [139/707] Compiling C object lib/librte_bitratestats.a.p/bitratestats_rte_bitrate.c.o 00:01:50.277 [140/707] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:01:50.277 [141/707] Linking static target lib/librte_bitratestats.a 00:01:50.277 [142/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:01:50.277 [143/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load_elf.c.o 00:01:50.277 [144/707] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:01:50.277 [145/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:01:50.277 [146/707] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:01:50.277 [147/707] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_match_sse.c.o 00:01:50.539 [148/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_convert.c.o 00:01:50.539 [149/707] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_private.c.o 00:01:50.539 [150/707] Compiling C object lib/librte_acl.a.p/acl_acl_gen.c.o 00:01:50.539 [151/707] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:01:50.540 [152/707] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:01:50.540 [153/707] Compiling C object 
lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:01:50.540 [154/707] Linking static target lib/librte_compressdev.a 00:01:50.540 [155/707] Linking static target lib/librte_telemetry.a 00:01:50.540 [156/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:01:50.540 [157/707] Generating lib/cfgfile.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.540 [158/707] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_single.c.o 00:01:50.540 [159/707] Generating lib/bitratestats.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.540 [160/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_exec.c.o 00:01:50.540 [161/707] Linking static target lib/librte_eal.a 00:01:50.540 [162/707] Compiling C object lib/librte_acl.a.p/acl_acl_run_scalar.c.o 00:01:50.540 [163/707] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:01:50.540 [164/707] Linking static target lib/librte_timer.a 00:01:50.540 [165/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:01:50.540 [166/707] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:01:50.540 [167/707] Linking static target lib/librte_rcu.a 00:01:50.798 [168/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_ring.c.o 00:01:50.798 [169/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:01:50.798 [170/707] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:01:50.798 [171/707] Generating lib/metrics.sym_chk with a custom command (wrapped by meson to capture output) 00:01:50.798 [172/707] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:01:50.798 [173/707] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:01:50.798 [174/707] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:01:50.798 [175/707] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor.c.o 00:01:50.798 [176/707] Compiling C object lib/librte_bbdev.a.p/bbdev_rte_bbdev.c.o 00:01:50.798 [177/707] Linking static target lib/librte_distributor.a 00:01:50.798 [178/707] Linking static target lib/librte_bbdev.a 00:01:50.798 [179/707] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:01:50.798 [180/707] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:01:50.798 [181/707] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:01:50.798 [182/707] Compiling C object lib/librte_gso.a.p/gso_gso_tcp4.c.o 00:01:50.798 [183/707] Linking static target lib/librte_mbuf.a 00:01:50.798 [184/707] Compiling C object lib/librte_gso.a.p/gso_gso_udp4.c.o 00:01:50.798 [185/707] Compiling C object lib/librte_gro.a.p/gro_gro_udp4.c.o 00:01:50.798 [186/707] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_udp4.c.o 00:01:51.062 [187/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_pkt.c.o 00:01:51.063 [188/707] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_tcp4.c.o 00:01:51.063 [189/707] Compiling C object lib/librte_dispatcher.a.p/dispatcher_rte_dispatcher.c.o 00:01:51.063 [190/707] Compiling C object lib/librte_gso.a.p/gso_rte_gso.c.o 00:01:51.063 [191/707] Linking static target lib/librte_dispatcher.a 00:01:51.063 [192/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_reassembly.c.o 00:01:51.063 [193/707] Compiling C object lib/librte_gro.a.p/gro_gro_tcp4.c.o 00:01:51.063 [194/707] Compiling C object lib/librte_gro.a.p/gro_gro_tcp6.c.o 00:01:51.063 [195/707] Compiling C object 
lib/librte_jobstats.a.p/jobstats_rte_jobstats.c.o 00:01:51.063 [196/707] Linking static target lib/librte_jobstats.a 00:01:51.063 [197/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_jit_x86.c.o 00:01:51.063 [198/707] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.063 [199/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_reassembly.c.o 00:01:51.063 [200/707] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:01:51.063 [201/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_fragmentation.c.o 00:01:51.063 [202/707] Linking static target lib/librte_dmadev.a 00:01:51.063 [203/707] Compiling C object lib/librte_gro.a.p/gro_rte_gro.c.o 00:01:51.063 [204/707] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.063 [205/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_dma_adapter.c.o 00:01:51.063 [206/707] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.063 [207/707] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_trace_points.c.o 00:01:51.063 [208/707] Compiling C object lib/librte_gpudev.a.p/gpudev_gpudev.c.o 00:01:51.063 [209/707] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.063 [210/707] Generating lib/distributor.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.327 [211/707] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_udp4.c.o 00:01:51.327 [212/707] Linking static target lib/librte_gpudev.a 00:01:51.327 [213/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_validate.c.o 00:01:51.327 [214/707] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_tcp4.c.o 00:01:51.327 [215/707] Linking static target lib/librte_bpf.a 00:01:51.327 [216/707] Linking static target lib/librte_gro.a 00:01:51.327 [217/707] Compiling C object lib/librte_sched.a.p/sched_rte_approx.c.o 00:01:51.327 [218/707] Compiling C object lib/librte_member.a.p/member_rte_member.c.o 00:01:51.327 [219/707] Linking target lib/librte_telemetry.so.24.0 00:01:51.327 [220/707] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.327 [221/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ip_frag_common.c.o 00:01:51.327 [222/707] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.327 [223/707] Compiling C object lib/member/libsketch_avx512_tmp.a.p/rte_member_sketch_avx512.c.o 00:01:51.327 [224/707] Compiling C object lib/librte_gso.a.p/gso_gso_common.c.o 00:01:51.327 [225/707] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:01:51.327 [226/707] Linking static target lib/member/libsketch_avx512_tmp.a 00:01:51.327 [227/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_ip_frag_internal.c.o 00:01:51.327 [228/707] Linking static target lib/librte_gso.a 00:01:51.327 [229/707] Compiling C object lib/librte_latencystats.a.p/latencystats_rte_latencystats.c.o 00:01:51.327 [230/707] Linking static target lib/librte_latencystats.a 00:01:51.327 [231/707] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:01:51.327 [232/707] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:01:51.587 [233/707] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:01:51.587 [234/707] Generating symbol file lib/librte_telemetry.so.24.0.p/librte_telemetry.so.24.0.symbols 00:01:51.587 [235/707] 
Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:01:51.587 [236/707] Compiling C object lib/librte_mldev.a.p/mldev_rte_mldev_pmd.c.o 00:01:51.587 [237/707] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils.c.o 00:01:51.587 [238/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_fragmentation.c.o 00:01:51.587 [239/707] Linking static target lib/librte_ip_frag.a 00:01:51.587 [240/707] Generating lib/jobstats.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.587 [241/707] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm.c.o 00:01:51.587 [242/707] Compiling C object lib/librte_stack.a.p/stack_rte_stack_std.c.o 00:01:51.587 [243/707] Generating lib/gro.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.587 [244/707] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:01:51.587 [245/707] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:01:51.587 [246/707] Generating lib/gso.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.587 [247/707] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.587 [248/707] Generating lib/bpf.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.587 [249/707] Compiling C object lib/librte_sched.a.p/sched_rte_pie.c.o 00:01:51.854 [250/707] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.854 [251/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_tx_adapter.c.o 00:01:51.854 [252/707] Generating lib/dispatcher.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.854 [253/707] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils_scalar_bfloat16.c.o 00:01:51.854 [254/707] Generating lib/latencystats.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.854 [255/707] Compiling C object lib/librte_sched.a.p/sched_rte_red.c.o 00:01:51.854 [256/707] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils_scalar.c.o 00:01:51.854 [257/707] Generating lib/bbdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.854 [258/707] Compiling C object lib/librte_acl.a.p/acl_acl_bld.c.o 00:01:51.854 [259/707] Compiling C object lib/librte_stack.a.p/stack_rte_stack_lf.c.o 00:01:51.854 [260/707] Compiling C object lib/librte_regexdev.a.p/regexdev_rte_regexdev.c.o 00:01:51.854 [261/707] Linking static target lib/librte_regexdev.a 00:01:51.854 [262/707] Compiling C object lib/librte_stack.a.p/stack_rte_stack.c.o 00:01:51.854 [263/707] Linking static target lib/librte_stack.a 00:01:51.854 [264/707] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:01:51.854 [265/707] Compiling C object lib/librte_rawdev.a.p/rawdev_rte_rawdev.c.o 00:01:51.854 [266/707] Compiling C object lib/librte_mldev.a.p/mldev_rte_mldev.c.o 00:01:51.854 [267/707] Linking static target lib/librte_rawdev.a 00:01:51.854 [268/707] Linking static target lib/librte_mldev.a 00:01:51.854 [269/707] Compiling C object lib/librte_pcapng.a.p/pcapng_rte_pcapng.c.o 00:01:51.854 [270/707] Linking static target lib/librte_pcapng.a 00:01:51.854 [271/707] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:01:52.112 [272/707] Linking static target lib/librte_power.a 00:01:52.112 [273/707] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:01:52.112 [274/707] Compiling C object 
lib/librte_eventdev.a.p/eventdev_rte_event_timer_adapter.c.o 00:01:52.112 [275/707] Generating lib/ip_frag.sym_chk with a custom command (wrapped by meson to capture output) 00:01:52.112 [276/707] Linking static target lib/librte_security.a 00:01:52.112 [277/707] Compiling C object lib/librte_rib.a.p/rib_rte_rib.c.o 00:01:52.112 [278/707] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:01:52.112 [279/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_ctrl_pdu.c.o 00:01:52.112 [280/707] Compiling C object lib/librte_ipsec.a.p/ipsec_ses.c.o 00:01:52.112 [281/707] Generating lib/stack.sym_chk with a custom command (wrapped by meson to capture output) 00:01:52.112 [282/707] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:01:52.112 [283/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_crypto_adapter.c.o 00:01:52.112 [284/707] Compiling C object lib/librte_table.a.p/table_rte_swx_keycmp.c.o 00:01:52.112 [285/707] Compiling C object lib/librte_efd.a.p/efd_rte_efd.c.o 00:01:52.112 [286/707] Linking static target lib/librte_reorder.a 00:01:52.112 [287/707] Compiling C object lib/librte_fib.a.p/fib_rte_fib.c.o 00:01:52.112 [288/707] Linking static target lib/librte_efd.a 00:01:52.112 [289/707] Compiling C object lib/librte_member.a.p/member_rte_member_vbf.c.o 00:01:52.372 [290/707] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:01:52.372 [291/707] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_telemetry.c.o 00:01:52.372 [292/707] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:01:52.372 [293/707] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm6.c.o 00:01:52.372 [294/707] Linking static target lib/librte_lpm.a 00:01:52.372 [295/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_reorder.c.o 00:01:52.372 [296/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_crypto.c.o 00:01:52.372 [297/707] Generating lib/pcapng.sym_chk with a custom command (wrapped by meson to capture output) 00:01:52.372 [298/707] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:01:52.372 [299/707] Compiling C object lib/librte_fib.a.p/fib_rte_fib6.c.o 00:01:52.372 [300/707] Compiling C object lib/librte_fib.a.p/fib_dir24_8_avx512.c.o 00:01:52.372 [301/707] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:01:52.372 [302/707] Compiling C object lib/librte_port.a.p/port_rte_port_sched.c.o 00:01:52.373 [303/707] Compiling C object lib/librte_member.a.p/member_rte_member_ht.c.o 00:01:52.373 [304/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_cnt.c.o 00:01:52.632 [305/707] Compiling C object lib/librte_ipsec.a.p/ipsec_sa.c.o 00:01:52.632 [306/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:01:52.632 [307/707] Generating lib/efd.sym_chk with a custom command (wrapped by meson to capture output) 00:01:52.632 [308/707] Compiling C object lib/librte_rib.a.p/rib_rte_rib6.c.o 00:01:52.632 [309/707] Linking static target lib/librte_rib.a 00:01:52.632 [310/707] Generating lib/gpudev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:52.632 [311/707] Compiling C object lib/librte_fib.a.p/fib_trie_avx512.c.o 00:01:52.632 [312/707] Generating lib/rawdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:52.632 [313/707] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:01:52.632 [314/707] Compiling C object lib/librte_pdcp.a.p/pdcp_rte_pdcp.c.o 00:01:52.632 [315/707] Generating lib/reorder.sym_chk with a custom 
command (wrapped by meson to capture output) 00:01:52.897 [316/707] Compiling C object lib/librte_port.a.p/port_rte_port_ras.c.o 00:01:52.897 [317/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_eventdev.c.o 00:01:52.897 [318/707] Generating lib/lpm.sym_chk with a custom command (wrapped by meson to capture output) 00:01:52.897 [319/707] Compiling C object lib/librte_port.a.p/port_rte_port_sym_crypto.c.o 00:01:52.897 [320/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_cuckoo.c.o 00:01:52.897 [321/707] Compiling C object lib/librte_acl.a.p/acl_acl_run_sse.c.o 00:01:52.897 [322/707] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:01:52.897 [323/707] Compiling C object lib/librte_table.a.p/table_rte_table_array.c.o 00:01:52.897 [324/707] Compiling C object lib/librte_port.a.p/port_rte_port_eventdev.c.o 00:01:53.155 [325/707] Generating lib/regexdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:53.155 [326/707] Compiling C object lib/librte_port.a.p/port_rte_port_frag.c.o 00:01:53.155 [327/707] Compiling C object lib/librte_table.a.p/table_rte_swx_table_wm.c.o 00:01:53.155 [328/707] Compiling C object lib/librte_table.a.p/table_rte_swx_table_learner.c.o 00:01:53.155 [329/707] Compiling C object lib/librte_table.a.p/table_rte_swx_table_selector.c.o 00:01:53.155 [330/707] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ethdev.c.o 00:01:53.155 [331/707] Compiling C object lib/librte_port.a.p/port_rte_swx_port_fd.c.o 00:01:53.155 [332/707] Compiling C object lib/librte_table.a.p/table_rte_table_stub.c.o 00:01:53.155 [333/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_port_in_action.c.o 00:01:53.155 [334/707] Compiling C object lib/librte_node.a.p/node_null.c.o 00:01:53.155 [335/707] Compiling C object lib/librte_port.a.p/port_rte_port_fd.c.o 00:01:53.155 [336/707] Compiling C object lib/librte_table.a.p/table_rte_table_lpm.c.o 00:01:53.155 [337/707] Compiling C object lib/librte_port.a.p/port_rte_port_ethdev.c.o 00:01:53.155 [338/707] Compiling C object lib/librte_table.a.p/table_rte_table_lpm_ipv6.c.o 00:01:53.155 [339/707] Compiling C object lib/librte_table.a.p/table_rte_swx_table_em.c.o 00:01:53.155 [340/707] Compiling C object lib/librte_table.a.p/table_rte_table_acl.c.o 00:01:53.155 [341/707] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:01:53.155 [342/707] Compiling C object lib/librte_port.a.p/port_rte_swx_port_source_sink.c.o 00:01:53.155 [343/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:01:53.415 [344/707] Compiling C object lib/librte_fib.a.p/fib_trie.c.o 00:01:53.415 [345/707] Compiling C object lib/librte_graph.a.p/graph_graph_populate.c.o 00:01:53.415 [346/707] Compiling C object lib/librte_graph.a.p/graph_graph_debug.c.o 00:01:53.415 [347/707] Generating lib/rib.sym_chk with a custom command (wrapped by meson to capture output) 00:01:53.415 [348/707] Compiling C object lib/librte_graph.a.p/graph_graph_ops.c.o 00:01:53.415 [349/707] Compiling C object lib/librte_graph.a.p/graph_graph_pcap.c.o 00:01:53.415 [350/707] Compiling C object lib/librte_graph.a.p/graph_rte_graph_worker.c.o 00:01:53.415 [351/707] Compiling C object lib/librte_port.a.p/port_rte_port_source_sink.c.o 00:01:53.415 [352/707] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_sad.c.o 00:01:53.415 [353/707] Compiling C object lib/librte_graph.a.p/graph_node.c.o 00:01:53.679 [354/707] Compiling C object 
lib/librte_port.a.p/port_rte_swx_port_ring.c.o 00:01:53.679 [355/707] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:01:53.679 [356/707] Linking static target lib/librte_cryptodev.a 00:01:53.679 [357/707] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:01:53.679 [358/707] Compiling C object lib/librte_node.a.p/node_ethdev_ctrl.c.o 00:01:53.679 [359/707] Compiling C object lib/librte_graph.a.p/graph_graph_stats.c.o 00:01:53.679 [360/707] Compiling C object lib/librte_graph.a.p/graph_graph.c.o 00:01:53.679 [361/707] Compiling C object lib/librte_fib.a.p/fib_dir24_8.c.o 00:01:53.679 [362/707] Linking static target lib/librte_fib.a 00:01:53.679 [363/707] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:01:53.679 [364/707] Compiling C object lib/librte_node.a.p/node_kernel_tx.c.o 00:01:53.679 [365/707] Compiling C object lib/librte_node.a.p/node_log.c.o 00:01:53.679 [366/707] Compiling C object lib/librte_pdump.a.p/pdump_rte_pdump.c.o 00:01:53.679 [367/707] Compiling C object lib/librte_node.a.p/node_pkt_drop.c.o 00:01:53.943 [368/707] Compiling C object lib/librte_node.a.p/node_ethdev_tx.c.o 00:01:53.943 [369/707] Linking static target lib/librte_pdump.a 00:01:53.943 [370/707] Compiling C object lib/librte_node.a.p/node_ethdev_rx.c.o 00:01:53.943 [371/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_pipeline.c.o 00:01:53.943 [372/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key8.c.o 00:01:53.943 [373/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:01:53.943 [374/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:01:53.943 [375/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_diag.c.o 00:01:53.943 [376/707] Compiling C object lib/librte_node.a.p/node_ip4_reassembly.c.o 00:01:53.943 [377/707] Compiling C object lib/librte_node.a.p/node_ip4_local.c.o 00:01:54.203 [378/707] Generating lib/mldev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:54.203 [379/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:01:54.203 [380/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key32.c.o 00:01:54.203 [381/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key16.c.o 00:01:54.203 [382/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:01:54.203 [383/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_ext.c.o 00:01:54.203 [384/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_hmc.c.o 00:01:54.203 [385/707] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:01:54.203 [386/707] Linking static target drivers/libtmp_rte_bus_vdev.a 00:01:54.203 [387/707] Compiling C object lib/librte_sched.a.p/sched_rte_sched.c.o 00:01:54.203 [388/707] Linking static target lib/librte_sched.a 00:01:54.203 [389/707] Compiling C object lib/librte_graph.a.p/graph_rte_graph_model_mcore_dispatch.c.o 00:01:54.203 [390/707] Linking static target lib/librte_graph.a 00:01:54.203 [391/707] Generating lib/pdump.sym_chk with a custom command (wrapped by meson to capture output) 00:01:54.203 [392/707] Generating lib/fib.sym_chk with a custom command (wrapped by meson to capture output) 00:01:54.203 [393/707] Compiling C object lib/librte_node.a.p/node_kernel_rx.c.o 00:01:54.203 [394/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_lru.c.o 00:01:54.203 [395/707] Linking static target 
lib/librte_table.a 00:01:54.462 [396/707] Compiling C object lib/acl/libavx2_tmp.a.p/acl_run_avx2.c.o 00:01:54.462 [397/707] Linking static target lib/acl/libavx2_tmp.a 00:01:54.462 [398/707] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_cmdline_test.c.o 00:01:54.462 [399/707] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_commands.c.o 00:01:54.462 [400/707] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_outb.c.o 00:01:54.462 [401/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:01:54.462 [402/707] Compiling C object lib/librte_node.a.p/node_udp4_input.c.o 00:01:54.462 [403/707] Linking static target drivers/libtmp_rte_bus_pci.a 00:01:54.462 [404/707] Compiling C object lib/librte_member.a.p/member_rte_member_sketch.c.o 00:01:54.462 [405/707] Compiling C object lib/librte_node.a.p/node_pkt_cls.c.o 00:01:54.462 [406/707] Linking static target lib/librte_member.a 00:01:54.462 [407/707] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:01:54.462 [408/707] Linking static target lib/librte_hash.a 00:01:54.462 [409/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_dcb.c.o 00:01:54.462 [410/707] Compiling C object app/dpdk-graph.p/graph_cli.c.o 00:01:54.462 [411/707] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:01:54.462 [412/707] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_inb.c.o 00:01:54.723 [413/707] Linking static target lib/librte_ipsec.a 00:01:54.723 [414/707] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:54.723 [415/707] Compiling C object lib/librte_node.a.p/node_ip6_lookup.c.o 00:01:54.723 [416/707] Linking static target drivers/librte_bus_vdev.a 00:01:54.723 [417/707] Compiling C object app/dpdk-graph.p/graph_conn.c.o 00:01:54.723 [418/707] Compiling C object drivers/librte_bus_vdev.so.24.0.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:54.723 [419/707] Compiling C object app/dpdk-graph.p/graph_ethdev_rx.c.o 00:01:54.723 [420/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_lan_hmc.c.o 00:01:54.723 [421/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_adminq.c.o 00:01:54.723 [422/707] Compiling C object app/dpdk-graph.p/graph_ip6_route.c.o 00:01:54.723 [423/707] Compiling C object app/dpdk-graph.p/graph_ip4_route.c.o 00:01:54.723 [424/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_vf_representor.c.o 00:01:54.723 [425/707] Compiling C object lib/librte_node.a.p/node_ip4_lookup.c.o 00:01:54.723 [426/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_tm.c.o 00:01:54.723 [427/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_nvm.c.o 00:01:54.723 [428/707] Compiling C object lib/librte_node.a.p/node_ip4_rewrite.c.o 00:01:54.723 [429/707] Generating lib/sched.sym_chk with a custom command (wrapped by meson to capture output) 00:01:54.723 [430/707] Compiling C object app/dpdk-graph.p/graph_mempool.c.o 00:01:54.987 [431/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_rx_adapter.c.o 00:01:54.987 [432/707] Compiling C object app/dpdk-graph.p/graph_utils.c.o 00:01:54.987 [433/707] Compiling C object app/dpdk-graph.p/graph_l3fwd.c.o 00:01:54.987 [434/707] Compiling C object app/dpdk-graph.p/graph_main.c.o 00:01:54.987 [435/707] Linking static target lib/librte_eventdev.a 00:01:54.987 [436/707] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:01:54.987 [437/707] Compiling C object 
lib/librte_pipeline.a.p/pipeline_rte_swx_ipsec.c.o 00:01:54.987 [438/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_test.c.o 00:01:54.987 [439/707] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:54.987 [440/707] Compiling C object drivers/librte_bus_pci.so.24.0.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:54.987 [441/707] Linking static target drivers/librte_bus_pci.a 00:01:54.987 [442/707] Compiling C object app/dpdk-dumpcap.p/dumpcap_main.c.o 00:01:54.987 [443/707] Compiling C object app/dpdk-graph.p/graph_graph.c.o 00:01:54.987 [444/707] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_main.c.o 00:01:54.987 [445/707] Generating lib/member.sym_chk with a custom command (wrapped by meson to capture output) 00:01:54.988 [446/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_recycle_mbufs_vec_common.c.o 00:01:54.988 [447/707] Compiling C object app/dpdk-graph.p/graph_ethdev.c.o 00:01:54.988 [448/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_process.c.o 00:01:54.988 [449/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_hash.c.o 00:01:54.988 [450/707] Linking static target lib/librte_pdcp.a 00:01:54.988 [451/707] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:54.988 [452/707] Compiling C object app/dpdk-graph.p/graph_neigh.c.o 00:01:55.251 [453/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_parser.c.o 00:01:55.251 [454/707] Generating lib/ipsec.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.251 [455/707] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:01:55.251 [456/707] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx512.c.o 00:01:55.251 [457/707] Linking static target drivers/libtmp_rte_mempool_ring.a 00:01:55.251 [458/707] Linking static target lib/librte_acl.a 00:01:55.251 [459/707] Generating lib/graph.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.251 [460/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_pf.c.o 00:01:55.251 [461/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_options_parse.c.o 00:01:55.251 [462/707] Compiling C object lib/librte_node.a.p/node_ip6_rewrite.c.o 00:01:55.513 [463/707] Linking static target lib/librte_node.a 00:01:55.513 [464/707] Compiling C object app/dpdk-test-acl.p/test-acl_main.c.o 00:01:55.513 [465/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_test.c.o 00:01:55.513 [466/707] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.513 [467/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ctl.c.o 00:01:55.513 [468/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_common.c.o 00:01:55.513 [469/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_main.c.o 00:01:55.513 [470/707] Generating lib/table.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.513 [471/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_throughput.c.o 00:01:55.513 [472/707] Compiling C object lib/librte_port.a.p/port_rte_port_ring.c.o 00:01:55.513 [473/707] Linking static target lib/librte_port.a 00:01:55.513 [474/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_ops.c.o 00:01:55.513 [475/707] Generating drivers/rte_mempool_ring.pmd.c 
with a custom command 00:01:55.513 [476/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vectors.c.o 00:01:55.513 [477/707] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:55.513 [478/707] Generating lib/pdcp.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.513 [479/707] Linking static target drivers/librte_mempool_ring.a 00:01:55.513 [480/707] Compiling C object drivers/librte_mempool_ring.so.24.0.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:55.781 [481/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_parser.c.o 00:01:55.781 [482/707] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.781 [483/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_main.c.o 00:01:55.781 [484/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_main.c.o 00:01:55.781 [485/707] Generating lib/acl.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.781 [486/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vector_parsing.c.o 00:01:55.781 [487/707] Compiling C object app/dpdk-pdump.p/pdump_main.c.o 00:01:55.781 [488/707] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.781 [489/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_options.c.o 00:01:55.781 [490/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_atq.c.o 00:01:55.781 [491/707] Compiling C object app/dpdk-test-dma-perf.p/test-dma-perf_main.c.o 00:01:56.045 [492/707] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_flow_gen.c.o 00:01:56.045 [493/707] Generating lib/node.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.045 [494/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_options_parsing.c.o 00:01:56.045 [495/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_main.c.o 00:01:56.046 [496/707] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_items_gen.c.o 00:01:56.046 [497/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_common.c.o 00:01:56.046 [498/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_ordered.c.o 00:01:56.046 [499/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_fdir.c.o 00:01:56.046 [500/707] Compiling C object app/dpdk-test-gpudev.p/test-gpudev_main.c.o 00:01:56.046 [501/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_rte_pmd_i40e.c.o 00:01:56.046 [502/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_stats.c.o 00:01:56.046 [503/707] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev.c.o 00:01:56.046 [504/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_options.c.o 00:01:56.046 [505/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_verify.c.o 00:01:56.046 [506/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_throughput.c.o 00:01:56.046 [507/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_model_common.c.o 00:01:56.303 [508/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_flow.c.o 00:01:56.303 [509/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_device_ops.c.o 00:01:56.303 [510/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_verify.c.o 
00:01:56.303 [511/707] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_actions_gen.c.o 00:01:56.303 [512/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_interleave.c.o 00:01:56.303 [513/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_queue.c.o 00:01:56.303 [514/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_common.c.o 00:01:56.303 [515/707] Compiling C object app/dpdk-proc-info.p/proc-info_main.c.o 00:01:56.303 [516/707] Compiling C object app/dpdk-test-dma-perf.p/test-dma-perf_benchmark.c.o 00:01:56.303 [517/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_pmd_cyclecount.c.o 00:01:56.303 [518/707] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_vector.c.o 00:01:56.303 [519/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_model_ops.c.o 00:01:56.303 [520/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_config.c.o 00:01:56.303 [521/707] Generating lib/port.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.303 [522/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_common.c.o 00:01:56.561 [523/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_latency.c.o 00:01:56.561 [524/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_common.c.o 00:01:56.561 [525/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_main.c.o 00:01:56.561 [526/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_cyclecount.c.o 00:01:56.561 [527/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_acl.c.o 00:01:56.561 [528/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_init.c.o 00:01:56.561 [529/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_stub.c.o 00:01:56.561 [530/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_hash.c.o 00:01:56.561 [531/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmd_flex_item.c.o 00:01:56.561 [532/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm.c.o 00:01:56.561 [533/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm_ipv6.c.o 00:01:56.561 [534/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_cman.c.o 00:01:56.561 [535/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_sse.c.o 00:01:56.561 [536/707] Compiling C object app/dpdk-test-fib.p/test-fib_main.c.o 00:01:56.561 [537/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline_spec.c.o 00:01:56.820 [538/707] Compiling C object app/dpdk-testpmd.p/test-pmd_ieee1588fwd.c.o 00:01:56.820 [539/707] Compiling C object app/dpdk-testpmd.p/test-pmd_iofwd.c.o 00:01:56.820 [540/707] Compiling C object drivers/net/i40e/libi40e_avx2_lib.a.p/i40e_rxtx_vec_avx2.c.o 00:01:56.820 [541/707] Linking static target drivers/net/i40e/libi40e_avx2_lib.a 00:01:56.820 [542/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_common.c.o 00:01:56.820 [543/707] Compiling C object drivers/net/i40e/libi40e_avx512_lib.a.p/i40e_rxtx_vec_avx512.c.o 00:01:56.820 [544/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_runtime.c.o 00:01:56.820 [545/707] Linking static target drivers/net/i40e/libi40e_avx512_lib.a 00:01:56.820 [546/707] Compiling C object app/dpdk-testpmd.p/test-pmd_5tswap.c.o 00:01:56.820 [547/707] Linking static target 
drivers/net/i40e/base/libi40e_base.a 00:01:56.820 [548/707] Compiling C object app/dpdk-testpmd.p/test-pmd_recycle_mbufs.c.o 00:01:56.820 [549/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_atq.c.o 00:01:56.820 [550/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_queue.c.o 00:01:56.820 [551/707] Compiling C object app/dpdk-testpmd.p/test-pmd_macfwd.c.o 00:01:57.079 [552/707] Compiling C object app/dpdk-testpmd.p/test-pmd_rxonly.c.o 00:01:57.079 [553/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_mtr.c.o 00:01:57.079 [554/707] Compiling C object app/dpdk-test-security-perf.p/test-security-perf_test_security_perf.c.o 00:01:57.079 [555/707] Compiling C object app/dpdk-testpmd.p/test-pmd_flowgen.c.o 00:01:57.079 [556/707] Compiling C object app/dpdk-testpmd.p/test-pmd_shared_rxq_fwd.c.o 00:01:57.079 [557/707] Compiling C object app/dpdk-testpmd.p/test-pmd_bpf_cmd.c.o 00:01:57.079 [558/707] Compiling C object app/dpdk-testpmd.p/test-pmd_macswap.c.o 00:01:57.079 [559/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:01:57.079 [560/707] Linking static target lib/librte_ethdev.a 00:01:57.079 [561/707] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_main.c.o 00:01:57.338 [562/707] Compiling C object app/dpdk-testpmd.p/test-pmd_util.c.o 00:01:57.338 [563/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_queue.c.o 00:01:57.338 [564/707] Compiling C object app/dpdk-testpmd.p/test-pmd_icmpecho.c.o 00:01:57.338 [565/707] Compiling C object app/dpdk-test-sad.p/test-sad_main.c.o 00:01:57.338 [566/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_tm.c.o 00:01:57.338 [567/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_atq.c.o 00:01:57.338 [568/707] Compiling C object app/dpdk-testpmd.p/test-pmd_parameters.c.o 00:01:57.338 [569/707] Compiling C object app/dpdk-testpmd.p/.._drivers_net_i40e_i40e_testpmd.c.o 00:01:57.338 [570/707] Compiling C object app/dpdk-test-regex.p/test-regex_main.c.o 00:01:57.905 [571/707] Compiling C object app/dpdk-test-security-perf.p/test_test_cryptodev_security_ipsec.c.o 00:01:57.905 [572/707] Compiling C object app/dpdk-testpmd.p/test-pmd_txonly.c.o 00:01:57.905 [573/707] Compiling C object app/dpdk-testpmd.p/test-pmd_testpmd.c.o 00:01:58.164 [574/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_common.c.o 00:01:58.164 [575/707] Compiling C object app/dpdk-testpmd.p/test-pmd_csumonly.c.o 00:01:58.424 [576/707] Generating lib/eventdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.424 [577/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline.c.o 00:01:58.683 [578/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_ethdev.c.o 00:01:58.683 [579/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx.c.o 00:01:58.683 [580/707] Linking static target drivers/libtmp_rte_net_i40e.a 00:01:58.942 [581/707] Compiling C object app/dpdk-testpmd.p/test-pmd_noisy_vnf.c.o 00:01:59.201 [582/707] Generating drivers/rte_net_i40e.pmd.c with a custom command 00:01:59.201 [583/707] Compiling C object drivers/librte_net_i40e.a.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:01:59.201 [584/707] Compiling C object drivers/librte_net_i40e.so.24.0.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:01:59.201 [585/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_common.c.o 00:01:59.201 [586/707] Linking static target 
drivers/librte_net_i40e.a 00:01:59.201 [587/707] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:01:59.459 [588/707] Compiling C object app/dpdk-testpmd.p/test-pmd_config.c.o 00:02:00.396 [589/707] Generating drivers/rte_net_i40e.sym_chk with a custom command (wrapped by meson to capture output) 00:02:00.655 [590/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_flow.c.o 00:02:01.592 [591/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline.c.o 00:02:02.161 [592/707] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:02:02.420 [593/707] Linking target lib/librte_eal.so.24.0 00:02:02.420 [594/707] Generating symbol file lib/librte_eal.so.24.0.p/librte_eal.so.24.0.symbols 00:02:02.420 [595/707] Linking target lib/librte_cfgfile.so.24.0 00:02:02.680 [596/707] Linking target lib/librte_pci.so.24.0 00:02:02.680 [597/707] Linking target lib/librte_ring.so.24.0 00:02:02.680 [598/707] Linking target lib/librte_timer.so.24.0 00:02:02.680 [599/707] Linking target lib/librte_dmadev.so.24.0 00:02:02.680 [600/707] Linking target lib/librte_meter.so.24.0 00:02:02.680 [601/707] Linking target drivers/librte_bus_vdev.so.24.0 00:02:02.680 [602/707] Linking target lib/librte_jobstats.so.24.0 00:02:02.680 [603/707] Linking target lib/librte_stack.so.24.0 00:02:02.680 [604/707] Linking target lib/librte_rawdev.so.24.0 00:02:02.680 [605/707] Linking target lib/librte_acl.so.24.0 00:02:02.680 [606/707] Generating symbol file lib/librte_ring.so.24.0.p/librte_ring.so.24.0.symbols 00:02:02.680 [607/707] Generating symbol file lib/librte_meter.so.24.0.p/librte_meter.so.24.0.symbols 00:02:02.680 [608/707] Generating symbol file lib/librte_pci.so.24.0.p/librte_pci.so.24.0.symbols 00:02:02.680 [609/707] Generating symbol file lib/librte_timer.so.24.0.p/librte_timer.so.24.0.symbols 00:02:02.680 [610/707] Generating symbol file drivers/librte_bus_vdev.so.24.0.p/librte_bus_vdev.so.24.0.symbols 00:02:02.680 [611/707] Linking target lib/librte_rcu.so.24.0 00:02:02.680 [612/707] Generating symbol file lib/librte_acl.so.24.0.p/librte_acl.so.24.0.symbols 00:02:02.680 [613/707] Generating symbol file lib/librte_dmadev.so.24.0.p/librte_dmadev.so.24.0.symbols 00:02:02.680 [614/707] Linking target lib/librte_mempool.so.24.0 00:02:02.680 [615/707] Linking target drivers/librte_bus_pci.so.24.0 00:02:02.680 [616/707] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_perf.c.o 00:02:02.939 [617/707] Generating symbol file lib/librte_rcu.so.24.0.p/librte_rcu.so.24.0.symbols 00:02:02.939 [618/707] Generating symbol file lib/librte_mempool.so.24.0.p/librte_mempool.so.24.0.symbols 00:02:02.939 [619/707] Generating symbol file drivers/librte_bus_pci.so.24.0.p/librte_bus_pci.so.24.0.symbols 00:02:02.939 [620/707] Linking target drivers/librte_mempool_ring.so.24.0 00:02:02.939 [621/707] Linking target lib/librte_rib.so.24.0 00:02:02.939 [622/707] Linking target lib/librte_mbuf.so.24.0 00:02:03.198 [623/707] Generating symbol file lib/librte_mbuf.so.24.0.p/librte_mbuf.so.24.0.symbols 00:02:03.198 [624/707] Generating symbol file lib/librte_rib.so.24.0.p/librte_rib.so.24.0.symbols 00:02:03.198 [625/707] Linking target lib/librte_bbdev.so.24.0 00:02:03.198 [626/707] Linking target lib/librte_fib.so.24.0 00:02:03.198 [627/707] Linking target lib/librte_compressdev.so.24.0 00:02:03.198 [628/707] Linking target lib/librte_cryptodev.so.24.0 00:02:03.198 [629/707] Linking target lib/librte_reorder.so.24.0 00:02:03.198 [630/707] Linking target 
lib/librte_gpudev.so.24.0 00:02:03.198 [631/707] Linking target lib/librte_distributor.so.24.0 00:02:03.198 [632/707] Linking target lib/librte_net.so.24.0 00:02:03.198 [633/707] Linking target lib/librte_sched.so.24.0 00:02:03.198 [634/707] Linking target lib/librte_regexdev.so.24.0 00:02:03.198 [635/707] Linking target lib/librte_mldev.so.24.0 00:02:03.457 [636/707] Generating symbol file lib/librte_net.so.24.0.p/librte_net.so.24.0.symbols 00:02:03.457 [637/707] Generating symbol file lib/librte_sched.so.24.0.p/librte_sched.so.24.0.symbols 00:02:03.457 [638/707] Generating symbol file lib/librte_cryptodev.so.24.0.p/librte_cryptodev.so.24.0.symbols 00:02:03.457 [639/707] Generating symbol file lib/librte_reorder.so.24.0.p/librte_reorder.so.24.0.symbols 00:02:03.457 [640/707] Linking target lib/librte_cmdline.so.24.0 00:02:03.457 [641/707] Linking target lib/librte_security.so.24.0 00:02:03.457 [642/707] Linking target lib/librte_hash.so.24.0 00:02:03.457 [643/707] Generating symbol file lib/librte_hash.so.24.0.p/librte_hash.so.24.0.symbols 00:02:03.457 [644/707] Generating symbol file lib/librte_security.so.24.0.p/librte_security.so.24.0.symbols 00:02:03.715 [645/707] Linking target lib/librte_lpm.so.24.0 00:02:03.715 [646/707] Linking target lib/librte_pdcp.so.24.0 00:02:03.715 [647/707] Linking target lib/librte_efd.so.24.0 00:02:03.715 [648/707] Linking target lib/librte_ipsec.so.24.0 00:02:03.715 [649/707] Linking target lib/librte_member.so.24.0 00:02:03.715 [650/707] Generating symbol file lib/librte_lpm.so.24.0.p/librte_lpm.so.24.0.symbols 00:02:03.715 [651/707] Generating symbol file lib/librte_ipsec.so.24.0.p/librte_ipsec.so.24.0.symbols 00:02:05.621 [652/707] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:05.881 [653/707] Linking target lib/librte_ethdev.so.24.0 00:02:05.881 [654/707] Generating symbol file lib/librte_ethdev.so.24.0.p/librte_ethdev.so.24.0.symbols 00:02:05.881 [655/707] Linking target lib/librte_ip_frag.so.24.0 00:02:05.881 [656/707] Linking target lib/librte_metrics.so.24.0 00:02:06.139 [657/707] Linking target lib/librte_pcapng.so.24.0 00:02:06.139 [658/707] Linking target lib/librte_gso.so.24.0 00:02:06.139 [659/707] Linking target lib/librte_gro.so.24.0 00:02:06.139 [660/707] Linking target lib/librte_bpf.so.24.0 00:02:06.139 [661/707] Linking target lib/librte_power.so.24.0 00:02:06.139 [662/707] Linking target lib/librte_eventdev.so.24.0 00:02:06.139 [663/707] Linking target drivers/librte_net_i40e.so.24.0 00:02:06.139 [664/707] Generating symbol file lib/librte_ip_frag.so.24.0.p/librte_ip_frag.so.24.0.symbols 00:02:06.139 [665/707] Generating symbol file lib/librte_metrics.so.24.0.p/librte_metrics.so.24.0.symbols 00:02:06.139 [666/707] Generating symbol file lib/librte_bpf.so.24.0.p/librte_bpf.so.24.0.symbols 00:02:06.139 [667/707] Generating symbol file lib/librte_pcapng.so.24.0.p/librte_pcapng.so.24.0.symbols 00:02:06.139 [668/707] Generating symbol file lib/librte_eventdev.so.24.0.p/librte_eventdev.so.24.0.symbols 00:02:06.140 [669/707] Linking target lib/librte_bitratestats.so.24.0 00:02:06.140 [670/707] Linking target lib/librte_latencystats.so.24.0 00:02:06.140 [671/707] Linking target lib/librte_graph.so.24.0 00:02:06.140 [672/707] Linking target lib/librte_pdump.so.24.0 00:02:06.399 [673/707] Linking target lib/librte_dispatcher.so.24.0 00:02:06.399 [674/707] Linking target lib/librte_port.so.24.0 00:02:06.399 [675/707] Generating symbol file lib/librte_graph.so.24.0.p/librte_graph.so.24.0.symbols 
00:02:06.399 [676/707] Generating symbol file lib/librte_port.so.24.0.p/librte_port.so.24.0.symbols 00:02:06.399 [677/707] Linking target lib/librte_node.so.24.0 00:02:06.399 [678/707] Linking target lib/librte_table.so.24.0 00:02:06.658 [679/707] Generating symbol file lib/librte_table.so.24.0.p/librte_table.so.24.0.symbols 00:02:10.914 [680/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_table_action.c.o 00:02:10.914 [681/707] Linking static target lib/librte_pipeline.a 00:02:11.173 [682/707] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:02:11.173 [683/707] Linking static target lib/librte_vhost.a 00:02:11.739 [684/707] Linking target app/dpdk-test-cmdline 00:02:11.739 [685/707] Linking target app/dpdk-test-crypto-perf 00:02:11.739 [686/707] Linking target app/dpdk-test-bbdev 00:02:11.739 [687/707] Linking target app/dpdk-testpmd 00:02:11.739 [688/707] Linking target app/dpdk-dumpcap 00:02:11.739 [689/707] Linking target app/dpdk-test-gpudev 00:02:11.739 [690/707] Linking target app/dpdk-test-regex 00:02:11.739 [691/707] Linking target app/dpdk-graph 00:02:11.739 [692/707] Linking target app/dpdk-pdump 00:02:11.739 [693/707] Linking target app/dpdk-test-sad 00:02:11.739 [694/707] Linking target app/dpdk-test-mldev 00:02:11.739 [695/707] Linking target app/dpdk-test-compress-perf 00:02:11.739 [696/707] Linking target app/dpdk-test-eventdev 00:02:11.997 [697/707] Linking target app/dpdk-test-dma-perf 00:02:11.997 [698/707] Linking target app/dpdk-test-fib 00:02:11.997 [699/707] Linking target app/dpdk-proc-info 00:02:11.997 [700/707] Linking target app/dpdk-test-flow-perf 00:02:11.997 [701/707] Linking target app/dpdk-test-acl 00:02:11.997 [702/707] Linking target app/dpdk-test-pipeline 00:02:11.997 [703/707] Linking target app/dpdk-test-security-perf 00:02:13.374 [704/707] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:02:13.374 [705/707] Linking target lib/librte_vhost.so.24.0 00:02:16.668 [706/707] Generating lib/pipeline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.668 [707/707] Linking target lib/librte_pipeline.so.24.0 00:02:16.668 11:56:29 -- common/autobuild_common.sh@187 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j72 install 00:02:16.668 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp' 00:02:16.668 [0/1] Installing files. 
00:02:16.668 Installing subdir /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples 00:02:16.668 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.668 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.668 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_route.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_fib.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_sequential.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/main.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/flow_blocks.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:16.669 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/pkt_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/neon/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/neon 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/altivec/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/altivec 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/sse/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/sse 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/ptpclient.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_x86.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:16.669 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_nop.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/app_thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_ov.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cmdline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/stats.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_red.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_pie.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/l2fwd-cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/vdpa_blk_compact.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/virtio_net.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk_spec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk_compat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ecdsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_aes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_sha.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_tdes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_rsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_gcm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_cmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_xts.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_hmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ccm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 
00:02:16.670 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/dmafwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/main.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/client.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/shared 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:02:16.671 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/basicfwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-macsec/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-macsec 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-macsec/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-macsec 00:02:16.671 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_tx.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep1.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp4.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.671 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.671 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_process.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/rt.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp6.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep0.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/load_env.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_null_header_reconstruct.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/run_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs_secgw.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/data_rxtx.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/bypass_defs.sh to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_ipv6opts.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/linux_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_node/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_node 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_node/node.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_node 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/shared 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.672 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.672 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 
00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route_ecmp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/rss.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/firewall.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/tap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow_crypto.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t1.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t3.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/README to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/dummy.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t2.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/rss.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.673 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ethdev.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.673 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/rss.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_routing_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.spec to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_group_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/packet.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/pcap.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.py to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec_sa.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/ntb_fwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer 00:02:16.674 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer 00:02:16.674 Installing lib/librte_log.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.674 Installing lib/librte_log.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.674 Installing lib/librte_kvargs.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.674 Installing lib/librte_kvargs.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.674 Installing lib/librte_telemetry.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.674 Installing lib/librte_telemetry.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.674 Installing lib/librte_eal.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.674 Installing lib/librte_eal.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.674 Installing lib/librte_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.674 Installing lib/librte_ring.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.674 Installing lib/librte_rcu.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.674 Installing lib/librte_rcu.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.674 Installing lib/librte_mempool.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.674 Installing lib/librte_mempool.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.674 Installing lib/librte_mbuf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.674 Installing lib/librte_mbuf.so.24.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.674 Installing lib/librte_net.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.674 Installing lib/librte_net.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.674 Installing lib/librte_meter.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.674 Installing lib/librte_meter.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.674 Installing lib/librte_ethdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.674 Installing lib/librte_ethdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.674 Installing lib/librte_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.674 Installing lib/librte_pci.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.674 Installing lib/librte_cmdline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.674 Installing lib/librte_cmdline.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.674 Installing lib/librte_metrics.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.674 Installing lib/librte_metrics.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.674 Installing lib/librte_hash.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.675 Installing lib/librte_hash.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.675 Installing lib/librte_timer.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.675 Installing lib/librte_timer.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.675 Installing lib/librte_acl.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.675 Installing lib/librte_acl.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.675 Installing lib/librte_bbdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.675 Installing lib/librte_bbdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.675 Installing lib/librte_bitratestats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.675 Installing lib/librte_bitratestats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.675 Installing lib/librte_bpf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.675 Installing lib/librte_bpf.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.675 Installing lib/librte_cfgfile.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.675 Installing lib/librte_cfgfile.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.675 Installing lib/librte_compressdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.675 Installing lib/librte_compressdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.675 Installing lib/librte_cryptodev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.675 Installing lib/librte_cryptodev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.675 Installing lib/librte_distributor.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.675 
Installing lib/librte_distributor.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.675 Installing lib/librte_dmadev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.675 Installing lib/librte_dmadev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.675 Installing lib/librte_efd.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.675 Installing lib/librte_efd.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.675 Installing lib/librte_eventdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.675 Installing lib/librte_eventdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.675 Installing lib/librte_dispatcher.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.675 Installing lib/librte_dispatcher.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.675 Installing lib/librte_gpudev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.675 Installing lib/librte_gpudev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.675 Installing lib/librte_gro.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.675 Installing lib/librte_gro.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.675 Installing lib/librte_gso.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.675 Installing lib/librte_gso.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.675 Installing lib/librte_ip_frag.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.675 Installing lib/librte_ip_frag.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.675 Installing lib/librte_jobstats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.675 Installing lib/librte_jobstats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.675 Installing lib/librte_latencystats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.675 Installing lib/librte_latencystats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.934 Installing lib/librte_lpm.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.934 Installing lib/librte_lpm.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.934 Installing lib/librte_member.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.934 Installing lib/librte_member.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.934 Installing lib/librte_pcapng.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.934 Installing lib/librte_pcapng.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.934 Installing lib/librte_power.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.935 Installing lib/librte_power.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.935 Installing lib/librte_rawdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.935 Installing lib/librte_rawdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.935 Installing lib/librte_regexdev.a to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.935 Installing lib/librte_regexdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.935 Installing lib/librte_mldev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.935 Installing lib/librte_mldev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.935 Installing lib/librte_rib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.935 Installing lib/librte_rib.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.935 Installing lib/librte_reorder.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.935 Installing lib/librte_reorder.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.935 Installing lib/librte_sched.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.935 Installing lib/librte_sched.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.935 Installing lib/librte_security.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.935 Installing lib/librte_security.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.935 Installing lib/librte_stack.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.935 Installing lib/librte_stack.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.935 Installing lib/librte_vhost.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.935 Installing lib/librte_vhost.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.935 Installing lib/librte_ipsec.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.935 Installing lib/librte_ipsec.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.935 Installing lib/librte_pdcp.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.935 Installing lib/librte_pdcp.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.935 Installing lib/librte_fib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.935 Installing lib/librte_fib.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.935 Installing lib/librte_port.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.935 Installing lib/librte_port.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.935 Installing lib/librte_pdump.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.935 Installing lib/librte_pdump.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.935 Installing lib/librte_table.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.935 Installing lib/librte_table.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.935 Installing lib/librte_pipeline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.935 Installing lib/librte_pipeline.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.935 Installing lib/librte_graph.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.935 Installing lib/librte_graph.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.935 Installing lib/librte_node.a 
to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.935 Installing lib/librte_node.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.935 Installing drivers/librte_bus_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.935 Installing drivers/librte_bus_pci.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0 00:02:16.935 Installing drivers/librte_bus_vdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.935 Installing drivers/librte_bus_vdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0 00:02:16.935 Installing drivers/librte_mempool_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.935 Installing drivers/librte_mempool_ring.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0 00:02:16.935 Installing drivers/librte_net_i40e.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.935 Installing drivers/librte_net_i40e.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0 00:02:16.935 Installing app/dpdk-dumpcap to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:16.935 Installing app/dpdk-graph to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:16.935 Installing app/dpdk-pdump to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:16.935 Installing app/dpdk-proc-info to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:16.935 Installing app/dpdk-test-acl to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:16.935 Installing app/dpdk-test-bbdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:16.935 Installing app/dpdk-test-cmdline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:16.935 Installing app/dpdk-test-compress-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:16.935 Installing app/dpdk-test-crypto-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:16.935 Installing app/dpdk-test-dma-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:16.935 Installing app/dpdk-test-eventdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:16.935 Installing app/dpdk-test-fib to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:16.935 Installing app/dpdk-test-flow-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:16.935 Installing app/dpdk-test-gpudev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:16.935 Installing app/dpdk-test-mldev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:16.935 Installing app/dpdk-test-pipeline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:16.935 Installing app/dpdk-testpmd to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:16.935 Installing app/dpdk-test-regex to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:16.935 Installing app/dpdk-test-sad to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:16.935 Installing app/dpdk-test-security-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:16.935 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/rte_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 
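The entries above record the DPDK install step laying out a local prefix under dpdk/build: command-line apps in build/bin, static archives and versioned shared objects in build/lib (with PMD plugins under build/lib/dpdk/pmds-24.0), and the public rte_*.h headers going into build/include below. As a minimal sketch only, not part of this log, a C program consuming such a prefix could initialize the EAL as follows; the file name, compile command, and paths are assumptions that depend on the local install:

    /* hello.c -- hypothetical consumer of the prefix installed above
     * (illustration only, not from this log). Assumed build command,
     * adjust to the local prefix and linker setup:
     *   gcc hello.c -Idpdk/build/include -Ldpdk/build/lib -lrte_eal -o hello
     */
    #include <stdio.h>
    #include <rte_eal.h>
    #include <rte_lcore.h>

    int main(int argc, char **argv)
    {
            /* rte_eal_init() parses the EAL command-line arguments and
             * brings up the runtime (memory, buses, lcores); it returns
             * the number of arguments consumed, or negative on failure. */
            int ret = rte_eal_init(argc, argv);
            if (ret < 0) {
                    fprintf(stderr, "EAL init failed\n");
                    return 1;
            }
            printf("EAL up on main lcore %u\n", rte_lcore_id());

            /* Release EAL resources before exiting. */
            rte_eal_cleanup();
            return 0;
    }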
00:02:16.935 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/log/rte_log.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:16.935 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/kvargs/rte_kvargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:16.935 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/telemetry/rte_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:16.935 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:16.935 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:16.935 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:16.935 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:16.935 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:16.935 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:16.935 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:16.935 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:16.935 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:16.935 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:16.935 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:16.935 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:16.935 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:16.935 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:16.935 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:16.935 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:16.935 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:16.935 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:16.935 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:16.935 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:16.935 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:16.935 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rtm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:16.935 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:16.935 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:16.935 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:16.935 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:16.935 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:16.935 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:16.936 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:16.936 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_alarm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:16.936 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitmap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:16.936 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:16.936 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_branch_prediction.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:16.936 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bus.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:16.936 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_class.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:16.936 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:16.936 
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_compat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:16.936 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_debug.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:16.936 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_dev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:16.936 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_devargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:16.936 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:16.936 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_memconfig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:16.936 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:16.936 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_errno.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:16.936 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_epoll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:16.936 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_fbarray.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:16.936 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hexdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:16.936 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hypervisor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:16.936 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_interrupts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:16.936 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_keepalive.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:16.936 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_launch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_lock_annotations.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_malloc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_mcslock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memory.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memzone.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_feature_defs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_features.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_per_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pflock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_random.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_reciprocal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqcount.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service_component.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_stdatomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_string_fns.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_tailq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_ticketlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_time.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point_register.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_uuid.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_version.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_vfio.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/linux/include/rte_os.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_c11_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_generic_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_zc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rcu/rte_rcu_qsbr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_ptype.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_pool_ops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_dyn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_tcp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_udp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_tls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.198 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_dtls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_sctp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_icmp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_arp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ether.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_macsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_vxlan.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gre.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gtp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_mpls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_higig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ecpri.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_pdcp_hdr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_geneve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_l2tpv2.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ppp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/meter/rte_meter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_cman.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_dev_info.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_eth_ctrl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pci/rte_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_num.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_ipaddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_etheraddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_string.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_rdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_vt100.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_socket.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_cirbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_portlist.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_fbk_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_jhash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_generic.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_sw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_x86_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/timer/rte_timer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl_osdep.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_op.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bitratestats/rte_bitrate.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/bpf_def.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cfgfile/rte_cfgfile.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_compressdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_comp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_sym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_asym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/distributor/rte_distributor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.199 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/efd/rte_efd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_crypto_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_dma_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_rx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_tx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_timer_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dispatcher/rte_dispatcher.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gpudev/rte_gpudev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gro/rte_gro.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gso/rte_gso.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ip_frag/rte_ip_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/jobstats/rte_jobstats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/latencystats/rte_latencystats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/member/rte_member.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pcapng/rte_pcapng.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_guest_channel.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_pmd_mgmt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_uncore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mldev/rte_mldev.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mldev/rte_mldev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/reorder/rte_reorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_approx.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_red.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_pie.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_std.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_c11.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_stubs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vdpa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_async.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdcp/rte_pdcp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdcp/rte_pdcp_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ras.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sym_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 
00:02:17.200 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.201 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.201 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.201 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdump/rte_pdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.201 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.201 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.201 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.201 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.201 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_learner.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.201 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_selector.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.201 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_wm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.201 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.201 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.201 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_array.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.201 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.201 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_cuckoo.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.201 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.201 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.201 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm_ipv6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.201 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_stub.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.201 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.201 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.201 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.201 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.201 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_port_in_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.201 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_table_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.201 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.201 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.201 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_extern.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.201 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_ctl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.201 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.201 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.201 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_model_mcore_dispatch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.201 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_model_rtc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.201 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_worker_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.201 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_eth_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.201 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_ip4_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.201 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_ip6_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.201 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_udp4_input_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.201 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/pci/rte_bus_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.201 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/vdev/rte_bus_vdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.201 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/net/i40e/rte_pmd_i40e.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.201 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/dpdk-cmdline-gen.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:17.201 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-devbind.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:17.201 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-pmdinfo.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:17.201 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-telemetry.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:17.201 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-hugepages.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:17.201 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-rss-flows.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:17.201 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/rte_build_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.201 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk-libs.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:02:17.201 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:02:17.201 Installing symlink pointing to librte_log.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_log.so.24 00:02:17.201 Installing symlink pointing to librte_log.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_log.so 00:02:17.201 Installing symlink pointing to librte_kvargs.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so.24 00:02:17.201 Installing symlink pointing to librte_kvargs.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so 00:02:17.201 Installing symlink pointing to librte_telemetry.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so.24 00:02:17.201 Installing symlink pointing to librte_telemetry.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so 00:02:17.201 Installing symlink pointing to librte_eal.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so.24 00:02:17.201 Installing symlink pointing to librte_eal.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so 00:02:17.201 Installing symlink pointing to librte_ring.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so.24 00:02:17.201 Installing symlink pointing to librte_ring.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so 00:02:17.201 Installing symlink pointing to librte_rcu.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so.24 00:02:17.201 Installing symlink pointing to 
librte_rcu.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so 00:02:17.201 Installing symlink pointing to librte_mempool.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so.24 00:02:17.201 Installing symlink pointing to librte_mempool.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so 00:02:17.201 Installing symlink pointing to librte_mbuf.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so.24 00:02:17.201 Installing symlink pointing to librte_mbuf.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so 00:02:17.201 Installing symlink pointing to librte_net.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so.24 00:02:17.201 Installing symlink pointing to librte_net.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so 00:02:17.201 Installing symlink pointing to librte_meter.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so.24 00:02:17.201 Installing symlink pointing to librte_meter.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so 00:02:17.201 Installing symlink pointing to librte_ethdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so.24 00:02:17.201 Installing symlink pointing to librte_ethdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so 00:02:17.201 Installing symlink pointing to librte_pci.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so.24 00:02:17.201 Installing symlink pointing to librte_pci.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so 00:02:17.201 Installing symlink pointing to librte_cmdline.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so.24 00:02:17.201 Installing symlink pointing to librte_cmdline.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so 00:02:17.201 Installing symlink pointing to librte_metrics.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so.24 00:02:17.201 Installing symlink pointing to librte_metrics.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so 00:02:17.201 Installing symlink pointing to librte_hash.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so.24 00:02:17.201 Installing symlink pointing to librte_hash.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so 00:02:17.201 Installing symlink pointing to librte_timer.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so.24 00:02:17.201 Installing symlink pointing to librte_timer.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so 00:02:17.202 Installing symlink pointing to librte_acl.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so.24 00:02:17.202 Installing symlink pointing to librte_acl.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so 00:02:17.202 Installing symlink pointing to librte_bbdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so.24 00:02:17.202 Installing symlink pointing to librte_bbdev.so.24 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so 00:02:17.202 Installing symlink pointing to librte_bitratestats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so.24 00:02:17.202 Installing symlink pointing to librte_bitratestats.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so 00:02:17.202 Installing symlink pointing to librte_bpf.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so.24 00:02:17.202 Installing symlink pointing to librte_bpf.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so 00:02:17.202 Installing symlink pointing to librte_cfgfile.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so.24 00:02:17.202 Installing symlink pointing to librte_cfgfile.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so 00:02:17.202 Installing symlink pointing to librte_compressdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so.24 00:02:17.202 Installing symlink pointing to librte_compressdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so 00:02:17.202 Installing symlink pointing to librte_cryptodev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so.24 00:02:17.202 Installing symlink pointing to librte_cryptodev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so 00:02:17.202 Installing symlink pointing to librte_distributor.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so.24 00:02:17.202 Installing symlink pointing to librte_distributor.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so 00:02:17.202 Installing symlink pointing to librte_dmadev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so.24 00:02:17.202 Installing symlink pointing to librte_dmadev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so 00:02:17.202 Installing symlink pointing to librte_efd.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so.24 00:02:17.202 Installing symlink pointing to librte_efd.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so 00:02:17.202 Installing symlink pointing to librte_eventdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so.24 00:02:17.202 Installing symlink pointing to librte_eventdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so 00:02:17.202 Installing symlink pointing to librte_dispatcher.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dispatcher.so.24 00:02:17.202 Installing symlink pointing to librte_dispatcher.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dispatcher.so 00:02:17.202 Installing symlink pointing to librte_gpudev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so.24 00:02:17.202 Installing symlink pointing to librte_gpudev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so 00:02:17.202 Installing symlink pointing to librte_gro.so.24.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so.24 00:02:17.202 Installing symlink pointing to librte_gro.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so 00:02:17.202 Installing symlink pointing to librte_gso.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so.24 00:02:17.202 Installing symlink pointing to librte_gso.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so 00:02:17.202 Installing symlink pointing to librte_ip_frag.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so.24 00:02:17.202 Installing symlink pointing to librte_ip_frag.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so 00:02:17.202 Installing symlink pointing to librte_jobstats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so.24 00:02:17.202 Installing symlink pointing to librte_jobstats.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so 00:02:17.202 Installing symlink pointing to librte_latencystats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so.24 00:02:17.202 Installing symlink pointing to librte_latencystats.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so 00:02:17.202 Installing symlink pointing to librte_lpm.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so.24 00:02:17.202 Installing symlink pointing to librte_lpm.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so 00:02:17.202 Installing symlink pointing to librte_member.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so.24 00:02:17.202 Installing symlink pointing to librte_member.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so 00:02:17.202 Installing symlink pointing to librte_pcapng.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so.24 00:02:17.202 Installing symlink pointing to librte_pcapng.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so 00:02:17.202 Installing symlink pointing to librte_power.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so.24 00:02:17.202 Installing symlink pointing to librte_power.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so 00:02:17.202 Installing symlink pointing to librte_rawdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so.24 00:02:17.202 Installing symlink pointing to librte_rawdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so 00:02:17.202 Installing symlink pointing to librte_regexdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so.24 00:02:17.202 Installing symlink pointing to librte_regexdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so 00:02:17.202 Installing symlink pointing to librte_mldev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mldev.so.24 00:02:17.202 Installing symlink pointing to librte_mldev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mldev.so 00:02:17.202 Installing symlink pointing to 
librte_rib.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so.24 00:02:17.202 Installing symlink pointing to librte_rib.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so 00:02:17.202 Installing symlink pointing to librte_reorder.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so.24 00:02:17.202 Installing symlink pointing to librte_reorder.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so 00:02:17.202 Installing symlink pointing to librte_sched.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so.24 00:02:17.202 Installing symlink pointing to librte_sched.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so 00:02:17.202 Installing symlink pointing to librte_security.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so.24 00:02:17.202 Installing symlink pointing to librte_security.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so 00:02:17.202 Installing symlink pointing to librte_stack.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so.24 00:02:17.202 Installing symlink pointing to librte_stack.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so 00:02:17.202 Installing symlink pointing to librte_vhost.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so.24 00:02:17.202 Installing symlink pointing to librte_vhost.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so 00:02:17.202 Installing symlink pointing to librte_ipsec.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so.24 00:02:17.202 Installing symlink pointing to librte_ipsec.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so 00:02:17.202 Installing symlink pointing to librte_pdcp.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdcp.so.24 00:02:17.202 Installing symlink pointing to librte_pdcp.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdcp.so 00:02:17.202 Installing symlink pointing to librte_fib.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so.24 00:02:17.202 Installing symlink pointing to librte_fib.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so 00:02:17.202 Installing symlink pointing to librte_port.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so.24 00:02:17.202 Installing symlink pointing to librte_port.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so 00:02:17.202 Installing symlink pointing to librte_pdump.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so.24 00:02:17.202 Installing symlink pointing to librte_pdump.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so 00:02:17.203 Installing symlink pointing to librte_table.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so.24 00:02:17.203 Installing symlink pointing to librte_table.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so 00:02:17.203 Installing symlink pointing to librte_pipeline.so.24.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so.24 00:02:17.203 Installing symlink pointing to librte_pipeline.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so 00:02:17.203 Installing symlink pointing to librte_graph.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so.24 00:02:17.203 Installing symlink pointing to librte_graph.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so 00:02:17.203 Installing symlink pointing to librte_node.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so.24 00:02:17.203 Installing symlink pointing to librte_node.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so 00:02:17.203 './librte_bus_pci.so' -> 'dpdk/pmds-24.0/librte_bus_pci.so' 00:02:17.203 './librte_bus_pci.so.24' -> 'dpdk/pmds-24.0/librte_bus_pci.so.24' 00:02:17.203 './librte_bus_pci.so.24.0' -> 'dpdk/pmds-24.0/librte_bus_pci.so.24.0' 00:02:17.203 './librte_bus_vdev.so' -> 'dpdk/pmds-24.0/librte_bus_vdev.so' 00:02:17.203 './librte_bus_vdev.so.24' -> 'dpdk/pmds-24.0/librte_bus_vdev.so.24' 00:02:17.203 './librte_bus_vdev.so.24.0' -> 'dpdk/pmds-24.0/librte_bus_vdev.so.24.0' 00:02:17.203 './librte_mempool_ring.so' -> 'dpdk/pmds-24.0/librte_mempool_ring.so' 00:02:17.203 './librte_mempool_ring.so.24' -> 'dpdk/pmds-24.0/librte_mempool_ring.so.24' 00:02:17.203 './librte_mempool_ring.so.24.0' -> 'dpdk/pmds-24.0/librte_mempool_ring.so.24.0' 00:02:17.203 './librte_net_i40e.so' -> 'dpdk/pmds-24.0/librte_net_i40e.so' 00:02:17.203 './librte_net_i40e.so.24' -> 'dpdk/pmds-24.0/librte_net_i40e.so.24' 00:02:17.203 './librte_net_i40e.so.24.0' -> 'dpdk/pmds-24.0/librte_net_i40e.so.24.0' 00:02:17.203 Installing symlink pointing to librte_bus_pci.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_pci.so.24 00:02:17.203 Installing symlink pointing to librte_bus_pci.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_pci.so 00:02:17.203 Installing symlink pointing to librte_bus_vdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_vdev.so.24 00:02:17.203 Installing symlink pointing to librte_bus_vdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_vdev.so 00:02:17.203 Installing symlink pointing to librte_mempool_ring.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_mempool_ring.so.24 00:02:17.203 Installing symlink pointing to librte_mempool_ring.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_mempool_ring.so 00:02:17.203 Installing symlink pointing to librte_net_i40e.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_net_i40e.so.24 00:02:17.203 Installing symlink pointing to librte_net_i40e.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_net_i40e.so 00:02:17.203 Running custom install script '/bin/sh /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/../buildtools/symlink-drivers-solibs.sh lib dpdk/pmds-24.0' 00:02:17.203 11:56:30 -- common/autobuild_common.sh@189 -- $ uname -s 00:02:17.203 11:56:30 -- common/autobuild_common.sh@189 -- $ [[ Linux == \F\r\e\e\B\S\D ]] 00:02:17.203 11:56:30 -- common/autobuild_common.sh@200 -- $ cat 
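The './librte_*.so' -> 'dpdk/pmds-24.0/...' moves and the symlink installs above are how DPDK separates its driver PMDs into a versioned plugin directory while keeping the usual soname chain intact; the custom install script symlink-drivers-solibs.sh re-creates the links inside lib/dpdk/pmds-24.0. As a rough illustration only (the real logic lives in Meson and that script), the chain for one driver looks like:

# Illustrative sketch, not taken from this build: the soname symlink chain
# for a single PMD under the plugin directory, matching the entries logged above.
PREFIX=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
cd "$PREFIX/lib/dpdk/pmds-24.0"
ln -sf librte_bus_pci.so.24.0 librte_bus_pci.so.24   # runtime loader resolves this soname
ln -sf librte_bus_pci.so.24 librte_bus_pci.so        # link-time name used by -lrte_bus_pci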
00:02:17.203 11:56:30 -- common/autobuild_common.sh@205 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:17.203 00:02:17.203 real 0m35.640s 00:02:17.203 user 9m53.077s 00:02:17.203 sys 2m9.948s 00:02:17.203 11:56:30 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:02:17.203 11:56:30 -- common/autotest_common.sh@10 -- $ set +x 00:02:17.203 ************************************ 00:02:17.203 END TEST build_native_dpdk 00:02:17.203 ************************************ 00:02:17.203 11:56:30 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in 00:02:17.203 11:56:30 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]] 00:02:17.203 11:56:30 -- spdk/autobuild.sh@51 -- $ [[ 1 -eq 1 ]] 00:02:17.203 11:56:30 -- spdk/autobuild.sh@52 -- $ llvm_precompile 00:02:17.203 11:56:30 -- common/autobuild_common.sh@423 -- $ run_test autobuild_llvm_precompile _llvm_precompile 00:02:17.203 11:56:30 -- common/autotest_common.sh@1077 -- $ '[' 2 -le 1 ']' 00:02:17.203 11:56:30 -- common/autotest_common.sh@1083 -- $ xtrace_disable 00:02:17.203 11:56:30 -- common/autotest_common.sh@10 -- $ set +x 00:02:17.203 ************************************ 00:02:17.203 START TEST autobuild_llvm_precompile 00:02:17.203 ************************************ 00:02:17.203 11:56:30 -- common/autotest_common.sh@1104 -- $ _llvm_precompile 00:02:17.203 11:56:30 -- common/autobuild_common.sh@32 -- $ clang --version 00:02:17.203 11:56:30 -- common/autobuild_common.sh@32 -- $ [[ clang version 16.0.6 (Fedora 16.0.6-3.fc38) 00:02:17.203 Target: x86_64-redhat-linux-gnu 00:02:17.203 Thread model: posix 00:02:17.203 InstalledDir: /usr/bin =~ version (([0-9]+).([0-9]+).([0-9]+)) ]] 00:02:17.203 11:56:30 -- common/autobuild_common.sh@33 -- $ clang_num=16 00:02:17.203 11:56:30 -- common/autobuild_common.sh@35 -- $ export CC=clang-16 00:02:17.203 11:56:30 -- common/autobuild_common.sh@35 -- $ CC=clang-16 00:02:17.203 11:56:30 -- common/autobuild_common.sh@36 -- $ export CXX=clang++-16 00:02:17.203 11:56:30 -- common/autobuild_common.sh@36 -- $ CXX=clang++-16 00:02:17.203 11:56:30 -- common/autobuild_common.sh@38 -- $ fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a) 00:02:17.203 11:56:30 -- common/autobuild_common.sh@39 -- $ fuzzer_lib=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a 00:02:17.203 11:56:30 -- common/autobuild_common.sh@40 -- $ [[ -e /usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a ]] 00:02:17.203 11:56:30 -- common/autobuild_common.sh@42 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a' 00:02:17.203 11:56:30 -- common/autobuild_common.sh@44 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a 00:02:17.462 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs... 
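The configure line above hands SPDK the freshly installed DPDK via --with-dpdk and the clang-16 fuzzer runtime via --with-fuzzer; the "Using .../dpdk/build/lib/pkgconfig for additional libs" message indicates the libdpdk.pc files installed earlier are what resolve the DPDK flags. A hedged sketch of the same resolution done by hand (output abbreviated, not copied from this log):

# Hedged sketch: resolving the locally installed DPDK through its generated
# pkg-config metadata, as configure does via the directory noted above.
export PKG_CONFIG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig
pkg-config --cflags libdpdk   # expected to print -I.../dpdk/build/include ...
pkg-config --libs libdpdk     # expected to print -L.../dpdk/build/lib plus -lrte_... entries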
00:02:17.719 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.719 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.719 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:02:18.284 Using 'verbs' RDMA provider 00:02:34.106 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done. 00:02:49.032 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done. 00:02:49.032 Creating mk/config.mk...done. 00:02:49.032 Creating mk/cc.flags.mk...done. 00:02:49.032 Type 'make' to build. 00:02:49.032 00:02:49.032 real 0m31.874s 00:02:49.032 user 0m14.667s 00:02:49.032 sys 0m16.523s 00:02:49.032 11:57:02 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:02:49.032 11:57:02 -- common/autotest_common.sh@10 -- $ set +x 00:02:49.032 ************************************ 00:02:49.032 END TEST autobuild_llvm_precompile 00:02:49.032 ************************************ 00:02:49.032 11:57:02 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]] 00:02:49.032 11:57:02 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]] 00:02:49.032 11:57:02 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]] 00:02:49.032 11:57:02 -- spdk/autobuild.sh@62 -- $ [[ 1 -eq 1 ]] 00:02:49.032 11:57:02 -- spdk/autobuild.sh@64 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a 00:02:49.290 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs... 00:02:49.547 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:49.547 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:49.805 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:02:50.064 Using 'verbs' RDMA provider 00:03:05.893 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done. 00:03:18.181 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done. 00:03:18.181 Creating mk/config.mk...done. 00:03:18.181 Creating mk/cc.flags.mk...done. 00:03:18.181 Type 'make' to build. 00:03:18.181 11:57:29 -- spdk/autobuild.sh@69 -- $ run_test make make -j72 00:03:18.181 11:57:29 -- common/autotest_common.sh@1077 -- $ '[' 3 -le 1 ']' 00:03:18.181 11:57:29 -- common/autotest_common.sh@1083 -- $ xtrace_disable 00:03:18.181 11:57:29 -- common/autotest_common.sh@10 -- $ set +x 00:03:18.181 ************************************ 00:03:18.181 START TEST make 00:03:18.181 ************************************ 00:03:18.181 11:57:29 -- common/autotest_common.sh@1104 -- $ make -j72 00:03:18.181 make[1]: Nothing to be done for 'all'. 
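Both configure passes above finish by generating mk/config.mk and mk/cc.flags.mk, after which the build proper runs as 'make -j72' wrapped in run_test for timing. A minimal sketch of sizing that job count to the host instead of hard-coding it (illustrative only; this CI job pins -j72):

# Illustrative sketch: derive make parallelism from the machine's CPU count
# rather than the fixed -j72 used by this job.
JOBS=$(nproc)
make -j"$JOBS"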
00:03:19.123 The Meson build system 00:03:19.123 Version: 1.3.1 00:03:19.123 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user 00:03:19.123 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:03:19.123 Build type: native build 00:03:19.123 Project name: libvfio-user 00:03:19.123 Project version: 0.0.1 00:03:19.123 C compiler for the host machine: clang-16 (clang 16.0.6 "clang version 16.0.6 (Fedora 16.0.6-3.fc38)") 00:03:19.123 C linker for the host machine: clang-16 ld.bfd 2.39-16 00:03:19.123 Host machine cpu family: x86_64 00:03:19.123 Host machine cpu: x86_64 00:03:19.123 Run-time dependency threads found: YES 00:03:19.123 Library dl found: YES 00:03:19.123 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0 00:03:19.123 Run-time dependency json-c found: YES 0.17 00:03:19.123 Run-time dependency cmocka found: YES 1.1.7 00:03:19.123 Program pytest-3 found: NO 00:03:19.123 Program flake8 found: NO 00:03:19.123 Program misspell-fixer found: NO 00:03:19.123 Program restructuredtext-lint found: NO 00:03:19.123 Program valgrind found: YES (/usr/bin/valgrind) 00:03:19.123 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:03:19.123 Compiler for C supports arguments -Wmissing-declarations: YES 00:03:19.123 Compiler for C supports arguments -Wwrite-strings: YES 00:03:19.123 ../libvfio-user/test/meson.build:20: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup. 00:03:19.123 Program test-lspci.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-lspci.sh) 00:03:19.123 Program test-linkage.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-linkage.sh) 00:03:19.123 ../libvfio-user/test/py/meson.build:16: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup. 
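This Meson configure summary (continued just below with the build-target count and the user-defined options buildtype=debug, default_library=static, libdir=/usr/local/lib) corresponds to a setup invocation roughly like the sketch that follows; the exact command issued by the SPDK build scripts is not captured in this log, so the flags are inferred from the printed options.

# Hedged sketch: a meson setup consistent with the source/build dirs and
# "User defined options" reported in this summary; flags are inferred, not logged.
meson setup /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug \
    /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user \
    -Dbuildtype=debug -Ddefault_library=static -Dlibdir=/usr/local/lib
ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug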
00:03:19.123 Build targets in project: 8 00:03:19.123 WARNING: Project specifies a minimum meson_version '>= 0.53.0' but uses features which were added in newer versions: 00:03:19.123 * 0.57.0: {'exclude_suites arg in add_test_setup'} 00:03:19.123 00:03:19.123 libvfio-user 0.0.1 00:03:19.123 00:03:19.123 User defined options 00:03:19.123 buildtype : debug 00:03:19.123 default_library: static 00:03:19.123 libdir : /usr/local/lib 00:03:19.123 00:03:19.123 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:03:19.686 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:03:19.686 [1/36] Compiling C object samples/shadow_ioeventfd_server.p/shadow_ioeventfd_server.c.o 00:03:19.686 [2/36] Compiling C object samples/lspci.p/lspci.c.o 00:03:19.686 [3/36] Compiling C object samples/null.p/null.c.o 00:03:19.686 [4/36] Compiling C object lib/libvfio-user.a.p/tran.c.o 00:03:19.686 [5/36] Compiling C object samples/client.p/.._lib_migration.c.o 00:03:19.686 [6/36] Compiling C object samples/gpio-pci-idio-16.p/gpio-pci-idio-16.c.o 00:03:19.686 [7/36] Compiling C object samples/client.p/.._lib_tran.c.o 00:03:19.686 [8/36] Compiling C object lib/libvfio-user.a.p/migration.c.o 00:03:19.686 [9/36] Compiling C object lib/libvfio-user.a.p/pci.c.o 00:03:19.686 [10/36] Compiling C object test/unit_tests.p/.._lib_irq.c.o 00:03:19.686 [11/36] Compiling C object test/unit_tests.p/.._lib_migration.c.o 00:03:19.686 [12/36] Compiling C object test/unit_tests.p/.._lib_pci.c.o 00:03:19.687 [13/36] Compiling C object lib/libvfio-user.a.p/dma.c.o 00:03:19.687 [14/36] Compiling C object lib/libvfio-user.a.p/tran_sock.c.o 00:03:19.687 [15/36] Compiling C object samples/client.p/.._lib_tran_sock.c.o 00:03:19.687 [16/36] Compiling C object test/unit_tests.p/.._lib_dma.c.o 00:03:19.687 [17/36] Compiling C object samples/server.p/server.c.o 00:03:19.687 [18/36] Compiling C object test/unit_tests.p/unit-tests.c.o 00:03:19.687 [19/36] Compiling C object lib/libvfio-user.a.p/irq.c.o 00:03:19.687 [20/36] Compiling C object test/unit_tests.p/.._lib_pci_caps.c.o 00:03:19.687 [21/36] Compiling C object test/unit_tests.p/.._lib_tran.c.o 00:03:19.687 [22/36] Compiling C object lib/libvfio-user.a.p/libvfio-user.c.o 00:03:19.687 [23/36] Compiling C object test/unit_tests.p/.._lib_tran_pipe.c.o 00:03:19.687 [24/36] Compiling C object test/unit_tests.p/mocks.c.o 00:03:19.687 [25/36] Compiling C object samples/client.p/client.c.o 00:03:19.687 [26/36] Compiling C object lib/libvfio-user.a.p/pci_caps.c.o 00:03:19.687 [27/36] Compiling C object test/unit_tests.p/.._lib_tran_sock.c.o 00:03:19.687 [28/36] Linking static target lib/libvfio-user.a 00:03:19.687 [29/36] Compiling C object test/unit_tests.p/.._lib_libvfio-user.c.o 00:03:19.687 [30/36] Linking target samples/client 00:03:19.687 [31/36] Linking target samples/shadow_ioeventfd_server 00:03:19.944 [32/36] Linking target samples/lspci 00:03:19.944 [33/36] Linking target samples/null 00:03:19.944 [34/36] Linking target samples/gpio-pci-idio-16 00:03:19.944 [35/36] Linking target samples/server 00:03:19.944 [36/36] Linking target test/unit_tests 00:03:19.944 INFO: autodetecting backend as ninja 00:03:19.944 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:03:19.944 DESTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user meson install --quiet -C 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:03:20.202 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:03:20.202 ninja: no work to do. 00:03:25.468 CC lib/ut/ut.o 00:03:25.468 CC lib/ut_mock/mock.o 00:03:25.468 CC lib/log/log.o 00:03:25.468 CC lib/log/log_flags.o 00:03:25.468 CC lib/log/log_deprecated.o 00:03:25.468 LIB libspdk_ut_mock.a 00:03:25.468 LIB libspdk_ut.a 00:03:25.468 LIB libspdk_log.a 00:03:25.726 CC lib/ioat/ioat.o 00:03:25.726 CXX lib/trace_parser/trace.o 00:03:25.726 CC lib/dma/dma.o 00:03:25.726 CC lib/util/base64.o 00:03:25.726 CC lib/util/bit_array.o 00:03:25.726 CC lib/util/cpuset.o 00:03:25.726 CC lib/util/crc16.o 00:03:25.726 CC lib/util/crc32.o 00:03:25.726 CC lib/util/crc32c.o 00:03:25.726 CC lib/util/crc32_ieee.o 00:03:25.726 CC lib/util/crc64.o 00:03:25.726 CC lib/util/dif.o 00:03:25.726 CC lib/util/fd.o 00:03:25.726 CC lib/util/file.o 00:03:25.726 CC lib/util/iov.o 00:03:25.726 CC lib/util/hexlify.o 00:03:25.726 CC lib/util/math.o 00:03:25.727 CC lib/util/pipe.o 00:03:25.727 CC lib/util/strerror_tls.o 00:03:25.727 CC lib/util/string.o 00:03:25.727 CC lib/util/uuid.o 00:03:25.727 CC lib/util/fd_group.o 00:03:25.727 CC lib/util/xor.o 00:03:25.727 CC lib/util/zipf.o 00:03:25.985 CC lib/vfio_user/host/vfio_user_pci.o 00:03:25.985 CC lib/vfio_user/host/vfio_user.o 00:03:25.985 LIB libspdk_dma.a 00:03:25.985 LIB libspdk_ioat.a 00:03:25.985 LIB libspdk_vfio_user.a 00:03:26.245 LIB libspdk_util.a 00:03:26.503 LIB libspdk_trace_parser.a 00:03:26.503 CC lib/rdma/common.o 00:03:26.503 CC lib/rdma/rdma_verbs.o 00:03:26.503 CC lib/vmd/vmd.o 00:03:26.503 CC lib/vmd/led.o 00:03:26.503 CC lib/json/json_parse.o 00:03:26.503 CC lib/json/json_util.o 00:03:26.503 CC lib/json/json_write.o 00:03:26.503 CC lib/conf/conf.o 00:03:26.503 CC lib/env_dpdk/env.o 00:03:26.503 CC lib/env_dpdk/pci.o 00:03:26.503 CC lib/idxd/idxd.o 00:03:26.503 CC lib/env_dpdk/memory.o 00:03:26.503 CC lib/idxd/idxd_user.o 00:03:26.503 CC lib/env_dpdk/init.o 00:03:26.503 CC lib/idxd/idxd_kernel.o 00:03:26.503 CC lib/env_dpdk/threads.o 00:03:26.503 CC lib/env_dpdk/pci_vmd.o 00:03:26.503 CC lib/env_dpdk/pci_ioat.o 00:03:26.503 CC lib/env_dpdk/pci_virtio.o 00:03:26.503 CC lib/env_dpdk/pci_idxd.o 00:03:26.503 CC lib/env_dpdk/pci_dpdk.o 00:03:26.503 CC lib/env_dpdk/pci_event.o 00:03:26.503 CC lib/env_dpdk/sigbus_handler.o 00:03:26.503 CC lib/env_dpdk/pci_dpdk_2207.o 00:03:26.503 CC lib/env_dpdk/pci_dpdk_2211.o 00:03:26.762 LIB libspdk_conf.a 00:03:26.762 LIB libspdk_json.a 00:03:26.762 LIB libspdk_rdma.a 00:03:27.020 LIB libspdk_vmd.a 00:03:27.020 LIB libspdk_idxd.a 00:03:27.020 CC lib/jsonrpc/jsonrpc_server.o 00:03:27.020 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:03:27.020 CC lib/jsonrpc/jsonrpc_client.o 00:03:27.020 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:03:27.279 LIB libspdk_jsonrpc.a 00:03:27.538 CC lib/rpc/rpc.o 00:03:27.797 LIB libspdk_rpc.a 00:03:27.797 LIB libspdk_env_dpdk.a 00:03:28.056 CC lib/sock/sock.o 00:03:28.056 CC lib/sock/sock_rpc.o 00:03:28.056 CC lib/notify/notify.o 00:03:28.056 CC lib/notify/notify_rpc.o 00:03:28.056 CC lib/trace/trace.o 00:03:28.056 CC lib/trace/trace_flags.o 00:03:28.056 CC lib/trace/trace_rpc.o 00:03:28.315 LIB libspdk_notify.a 00:03:28.315 LIB libspdk_trace.a 00:03:28.315 LIB libspdk_sock.a 00:03:28.574 CC lib/thread/thread.o 00:03:28.574 CC lib/thread/iobuf.o 00:03:28.832 CC lib/nvme/nvme_ctrlr_cmd.o 00:03:28.832 CC lib/nvme/nvme_ctrlr.o 00:03:28.832 CC 
lib/nvme/nvme_fabric.o 00:03:28.832 CC lib/nvme/nvme_ns_cmd.o 00:03:28.832 CC lib/nvme/nvme_ns.o 00:03:28.832 CC lib/nvme/nvme_pcie_common.o 00:03:28.832 CC lib/nvme/nvme_pcie.o 00:03:28.832 CC lib/nvme/nvme_qpair.o 00:03:28.832 CC lib/nvme/nvme.o 00:03:28.832 CC lib/nvme/nvme_quirks.o 00:03:28.832 CC lib/nvme/nvme_transport.o 00:03:28.832 CC lib/nvme/nvme_discovery.o 00:03:28.832 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:03:28.832 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:03:28.832 CC lib/nvme/nvme_tcp.o 00:03:28.832 CC lib/nvme/nvme_opal.o 00:03:28.832 CC lib/nvme/nvme_io_msg.o 00:03:28.832 CC lib/nvme/nvme_poll_group.o 00:03:28.832 CC lib/nvme/nvme_zns.o 00:03:28.832 CC lib/nvme/nvme_cuse.o 00:03:28.832 CC lib/nvme/nvme_vfio_user.o 00:03:28.832 CC lib/nvme/nvme_rdma.o 00:03:29.766 LIB libspdk_thread.a 00:03:30.025 CC lib/virtio/virtio.o 00:03:30.025 CC lib/virtio/virtio_vhost_user.o 00:03:30.025 CC lib/virtio/virtio_vfio_user.o 00:03:30.025 CC lib/init/json_config.o 00:03:30.025 CC lib/accel/accel.o 00:03:30.025 CC lib/virtio/virtio_pci.o 00:03:30.025 CC lib/init/subsystem.o 00:03:30.025 CC lib/vfu_tgt/tgt_endpoint.o 00:03:30.025 CC lib/init/subsystem_rpc.o 00:03:30.025 CC lib/accel/accel_rpc.o 00:03:30.025 CC lib/vfu_tgt/tgt_rpc.o 00:03:30.025 CC lib/accel/accel_sw.o 00:03:30.025 CC lib/init/rpc.o 00:03:30.025 CC lib/blob/blobstore.o 00:03:30.025 CC lib/blob/blob_bs_dev.o 00:03:30.025 CC lib/blob/request.o 00:03:30.025 CC lib/blob/zeroes.o 00:03:30.284 LIB libspdk_init.a 00:03:30.284 LIB libspdk_virtio.a 00:03:30.284 LIB libspdk_vfu_tgt.a 00:03:30.543 LIB libspdk_nvme.a 00:03:30.543 CC lib/event/app.o 00:03:30.543 CC lib/event/reactor.o 00:03:30.543 CC lib/event/log_rpc.o 00:03:30.543 CC lib/event/app_rpc.o 00:03:30.543 CC lib/event/scheduler_static.o 00:03:31.111 LIB libspdk_event.a 00:03:31.111 LIB libspdk_accel.a 00:03:31.370 CC lib/bdev/bdev.o 00:03:31.370 CC lib/bdev/bdev_rpc.o 00:03:31.370 CC lib/bdev/bdev_zone.o 00:03:31.370 CC lib/bdev/scsi_nvme.o 00:03:31.370 CC lib/bdev/part.o 00:03:32.305 LIB libspdk_blob.a 00:03:32.873 CC lib/blobfs/blobfs.o 00:03:32.873 CC lib/blobfs/tree.o 00:03:32.873 CC lib/lvol/lvol.o 00:03:33.441 LIB libspdk_lvol.a 00:03:33.441 LIB libspdk_blobfs.a 00:03:33.700 LIB libspdk_bdev.a 00:03:34.268 CC lib/nvmf/ctrlr.o 00:03:34.268 CC lib/nvmf/ctrlr_discovery.o 00:03:34.268 CC lib/ftl/ftl_core.o 00:03:34.268 CC lib/nvmf/ctrlr_bdev.o 00:03:34.268 CC lib/nvmf/subsystem.o 00:03:34.268 CC lib/ftl/ftl_init.o 00:03:34.268 CC lib/ftl/ftl_layout.o 00:03:34.268 CC lib/nvmf/nvmf.o 00:03:34.268 CC lib/nvmf/nvmf_rpc.o 00:03:34.268 CC lib/scsi/dev.o 00:03:34.268 CC lib/ftl/ftl_debug.o 00:03:34.268 CC lib/scsi/port.o 00:03:34.268 CC lib/nvmf/transport.o 00:03:34.268 CC lib/ftl/ftl_io.o 00:03:34.268 CC lib/scsi/lun.o 00:03:34.268 CC lib/nvmf/tcp.o 00:03:34.268 CC lib/nbd/nbd.o 00:03:34.268 CC lib/ftl/ftl_sb.o 00:03:34.268 CC lib/nvmf/vfio_user.o 00:03:34.268 CC lib/ftl/ftl_l2p.o 00:03:34.268 CC lib/scsi/scsi.o 00:03:34.268 CC lib/ublk/ublk.o 00:03:34.268 CC lib/nbd/nbd_rpc.o 00:03:34.268 CC lib/ublk/ublk_rpc.o 00:03:34.268 CC lib/ftl/ftl_l2p_flat.o 00:03:34.268 CC lib/nvmf/rdma.o 00:03:34.268 CC lib/scsi/scsi_bdev.o 00:03:34.268 CC lib/scsi/scsi_pr.o 00:03:34.268 CC lib/ftl/ftl_nv_cache.o 00:03:34.268 CC lib/ftl/ftl_band.o 00:03:34.268 CC lib/scsi/scsi_rpc.o 00:03:34.268 CC lib/ftl/ftl_band_ops.o 00:03:34.268 CC lib/ftl/ftl_writer.o 00:03:34.268 CC lib/ftl/ftl_rq.o 00:03:34.268 CC lib/scsi/task.o 00:03:34.268 CC lib/ftl/ftl_reloc.o 00:03:34.268 CC lib/ftl/ftl_l2p_cache.o 00:03:34.268 
CC lib/ftl/ftl_p2l.o 00:03:34.268 CC lib/ftl/mngt/ftl_mngt.o 00:03:34.268 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:03:34.268 CC lib/ftl/mngt/ftl_mngt_startup.o 00:03:34.268 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:03:34.268 CC lib/ftl/mngt/ftl_mngt_md.o 00:03:34.268 CC lib/ftl/mngt/ftl_mngt_misc.o 00:03:34.268 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:03:34.268 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:03:34.268 CC lib/ftl/mngt/ftl_mngt_band.o 00:03:34.268 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:03:34.268 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:03:34.268 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:03:34.268 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:03:34.268 CC lib/ftl/utils/ftl_conf.o 00:03:34.268 CC lib/ftl/utils/ftl_md.o 00:03:34.268 CC lib/ftl/utils/ftl_mempool.o 00:03:34.268 CC lib/ftl/utils/ftl_bitmap.o 00:03:34.268 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:03:34.268 CC lib/ftl/utils/ftl_property.o 00:03:34.268 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:03:34.268 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:03:34.268 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:03:34.268 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:03:34.268 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:03:34.268 CC lib/ftl/upgrade/ftl_sb_v3.o 00:03:34.268 CC lib/ftl/upgrade/ftl_sb_v5.o 00:03:34.268 CC lib/ftl/nvc/ftl_nvc_dev.o 00:03:34.268 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:03:34.268 CC lib/ftl/base/ftl_base_dev.o 00:03:34.268 CC lib/ftl/base/ftl_base_bdev.o 00:03:34.268 CC lib/ftl/ftl_trace.o 00:03:34.837 LIB libspdk_nbd.a 00:03:34.837 LIB libspdk_scsi.a 00:03:34.837 LIB libspdk_ublk.a 00:03:35.096 CC lib/vhost/vhost.o 00:03:35.096 CC lib/vhost/vhost_scsi.o 00:03:35.096 CC lib/vhost/vhost_rpc.o 00:03:35.096 CC lib/vhost/vhost_blk.o 00:03:35.096 CC lib/vhost/rte_vhost_user.o 00:03:35.096 CC lib/iscsi/conn.o 00:03:35.096 CC lib/iscsi/init_grp.o 00:03:35.096 CC lib/iscsi/iscsi.o 00:03:35.096 CC lib/iscsi/md5.o 00:03:35.096 CC lib/iscsi/param.o 00:03:35.096 CC lib/iscsi/portal_grp.o 00:03:35.096 CC lib/iscsi/iscsi_rpc.o 00:03:35.096 CC lib/iscsi/tgt_node.o 00:03:35.096 CC lib/iscsi/iscsi_subsystem.o 00:03:35.096 CC lib/iscsi/task.o 00:03:35.096 LIB libspdk_ftl.a 00:03:36.039 LIB libspdk_nvmf.a 00:03:36.039 LIB libspdk_vhost.a 00:03:36.299 LIB libspdk_iscsi.a 00:03:36.866 CC module/vfu_device/vfu_virtio.o 00:03:36.866 CC module/vfu_device/vfu_virtio_blk.o 00:03:36.866 CC module/vfu_device/vfu_virtio_scsi.o 00:03:36.866 CC module/vfu_device/vfu_virtio_rpc.o 00:03:36.866 CC module/env_dpdk/env_dpdk_rpc.o 00:03:36.866 CC module/accel/ioat/accel_ioat_rpc.o 00:03:36.866 CC module/accel/ioat/accel_ioat.o 00:03:36.866 CC module/accel/dsa/accel_dsa.o 00:03:36.866 CC module/accel/iaa/accel_iaa.o 00:03:36.866 CC module/accel/error/accel_error_rpc.o 00:03:36.866 CC module/accel/error/accel_error.o 00:03:36.866 CC module/accel/iaa/accel_iaa_rpc.o 00:03:36.866 CC module/accel/dsa/accel_dsa_rpc.o 00:03:36.866 CC module/blob/bdev/blob_bdev.o 00:03:36.866 CC module/scheduler/dynamic/scheduler_dynamic.o 00:03:36.866 CC module/sock/posix/posix.o 00:03:36.866 LIB libspdk_env_dpdk_rpc.a 00:03:36.866 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:03:36.866 CC module/scheduler/gscheduler/gscheduler.o 00:03:37.125 LIB libspdk_scheduler_dpdk_governor.a 00:03:37.125 LIB libspdk_accel_error.a 00:03:37.125 LIB libspdk_scheduler_gscheduler.a 00:03:37.125 LIB libspdk_accel_ioat.a 00:03:37.125 LIB libspdk_scheduler_dynamic.a 00:03:37.125 LIB libspdk_accel_iaa.a 00:03:37.125 LIB libspdk_accel_dsa.a 00:03:37.125 LIB libspdk_blob_bdev.a 00:03:37.125 LIB libspdk_vfu_device.a 00:03:37.384 LIB 
libspdk_sock_posix.a 00:03:37.384 CC module/bdev/gpt/gpt.o 00:03:37.384 CC module/bdev/gpt/vbdev_gpt.o 00:03:37.384 CC module/bdev/malloc/bdev_malloc.o 00:03:37.384 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:03:37.384 CC module/bdev/null/bdev_null.o 00:03:37.384 CC module/bdev/malloc/bdev_malloc_rpc.o 00:03:37.384 CC module/bdev/zone_block/vbdev_zone_block.o 00:03:37.384 CC module/bdev/null/bdev_null_rpc.o 00:03:37.384 CC module/bdev/ftl/bdev_ftl.o 00:03:37.384 CC module/bdev/ftl/bdev_ftl_rpc.o 00:03:37.384 CC module/bdev/lvol/vbdev_lvol.o 00:03:37.384 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:03:37.384 CC module/bdev/delay/vbdev_delay.o 00:03:37.384 CC module/bdev/delay/vbdev_delay_rpc.o 00:03:37.384 CC module/bdev/nvme/bdev_nvme_rpc.o 00:03:37.384 CC module/bdev/nvme/bdev_nvme.o 00:03:37.384 CC module/bdev/nvme/bdev_mdns_client.o 00:03:37.384 CC module/bdev/nvme/vbdev_opal.o 00:03:37.384 CC module/bdev/nvme/nvme_rpc.o 00:03:37.384 CC module/bdev/nvme/vbdev_opal_rpc.o 00:03:37.642 CC module/blobfs/bdev/blobfs_bdev.o 00:03:37.642 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:03:37.642 CC module/bdev/virtio/bdev_virtio_scsi.o 00:03:37.642 CC module/bdev/raid/bdev_raid.o 00:03:37.642 CC module/bdev/split/vbdev_split.o 00:03:37.642 CC module/bdev/virtio/bdev_virtio_rpc.o 00:03:37.642 CC module/bdev/virtio/bdev_virtio_blk.o 00:03:37.642 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:03:37.642 CC module/bdev/split/vbdev_split_rpc.o 00:03:37.642 CC module/bdev/raid/bdev_raid_rpc.o 00:03:37.642 CC module/bdev/raid/bdev_raid_sb.o 00:03:37.642 CC module/bdev/passthru/vbdev_passthru.o 00:03:37.642 CC module/bdev/raid/raid1.o 00:03:37.642 CC module/bdev/raid/raid0.o 00:03:37.642 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:03:37.642 CC module/bdev/raid/concat.o 00:03:37.642 CC module/bdev/aio/bdev_aio.o 00:03:37.642 CC module/bdev/aio/bdev_aio_rpc.o 00:03:37.642 CC module/bdev/error/vbdev_error.o 00:03:37.642 CC module/bdev/error/vbdev_error_rpc.o 00:03:37.642 CC module/bdev/iscsi/bdev_iscsi.o 00:03:37.642 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:03:37.642 LIB libspdk_bdev_gpt.a 00:03:37.642 LIB libspdk_bdev_split.a 00:03:37.642 LIB libspdk_bdev_null.a 00:03:37.901 LIB libspdk_bdev_ftl.a 00:03:37.901 LIB libspdk_blobfs_bdev.a 00:03:37.901 LIB libspdk_bdev_zone_block.a 00:03:37.901 LIB libspdk_bdev_aio.a 00:03:37.901 LIB libspdk_bdev_passthru.a 00:03:37.901 LIB libspdk_bdev_iscsi.a 00:03:37.901 LIB libspdk_bdev_error.a 00:03:37.901 LIB libspdk_bdev_lvol.a 00:03:37.901 LIB libspdk_bdev_virtio.a 00:03:37.901 LIB libspdk_bdev_malloc.a 00:03:37.901 LIB libspdk_bdev_delay.a 00:03:38.160 LIB libspdk_bdev_raid.a 00:03:39.189 LIB libspdk_bdev_nvme.a 00:03:39.757 CC module/event/subsystems/vfu_tgt/vfu_tgt.o 00:03:40.016 CC module/event/subsystems/vmd/vmd_rpc.o 00:03:40.016 CC module/event/subsystems/vmd/vmd.o 00:03:40.016 CC module/event/subsystems/sock/sock.o 00:03:40.016 CC module/event/subsystems/scheduler/scheduler.o 00:03:40.016 CC module/event/subsystems/iobuf/iobuf.o 00:03:40.016 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:03:40.016 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:03:40.016 LIB libspdk_event_vfu_tgt.a 00:03:40.016 LIB libspdk_event_vmd.a 00:03:40.016 LIB libspdk_event_sock.a 00:03:40.016 LIB libspdk_event_scheduler.a 00:03:40.016 LIB libspdk_event_vhost_blk.a 00:03:40.016 LIB libspdk_event_iobuf.a 00:03:40.275 CC module/event/subsystems/accel/accel.o 00:03:40.535 LIB libspdk_event_accel.a 00:03:40.794 CC module/event/subsystems/bdev/bdev.o 00:03:41.054 LIB 
libspdk_event_bdev.a 00:03:41.313 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:03:41.313 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:03:41.313 CC module/event/subsystems/scsi/scsi.o 00:03:41.313 CC module/event/subsystems/nbd/nbd.o 00:03:41.313 CC module/event/subsystems/ublk/ublk.o 00:03:41.313 LIB libspdk_event_nbd.a 00:03:41.313 LIB libspdk_event_scsi.a 00:03:41.313 LIB libspdk_event_ublk.a 00:03:41.571 LIB libspdk_event_nvmf.a 00:03:41.829 CC module/event/subsystems/iscsi/iscsi.o 00:03:41.829 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:03:41.829 LIB libspdk_event_vhost_scsi.a 00:03:41.829 LIB libspdk_event_iscsi.a 00:03:42.089 CC test/rpc_client/rpc_client_test.o 00:03:42.089 CXX app/trace/trace.o 00:03:42.089 CC app/spdk_lspci/spdk_lspci.o 00:03:42.089 CC app/trace_record/trace_record.o 00:03:42.089 TEST_HEADER include/spdk/accel.h 00:03:42.089 TEST_HEADER include/spdk/accel_module.h 00:03:42.089 CC app/spdk_nvme_discover/discovery_aer.o 00:03:42.089 CC app/spdk_nvme_identify/identify.o 00:03:42.089 CC app/spdk_top/spdk_top.o 00:03:42.352 TEST_HEADER include/spdk/assert.h 00:03:42.352 CC app/spdk_nvme_perf/perf.o 00:03:42.352 TEST_HEADER include/spdk/barrier.h 00:03:42.352 TEST_HEADER include/spdk/base64.h 00:03:42.352 TEST_HEADER include/spdk/bdev.h 00:03:42.352 TEST_HEADER include/spdk/bdev_module.h 00:03:42.352 TEST_HEADER include/spdk/bdev_zone.h 00:03:42.352 TEST_HEADER include/spdk/bit_array.h 00:03:42.352 TEST_HEADER include/spdk/bit_pool.h 00:03:42.352 TEST_HEADER include/spdk/blob_bdev.h 00:03:42.352 TEST_HEADER include/spdk/blobfs_bdev.h 00:03:42.352 TEST_HEADER include/spdk/blobfs.h 00:03:42.352 TEST_HEADER include/spdk/blob.h 00:03:42.352 TEST_HEADER include/spdk/conf.h 00:03:42.352 TEST_HEADER include/spdk/config.h 00:03:42.352 TEST_HEADER include/spdk/cpuset.h 00:03:42.352 TEST_HEADER include/spdk/crc16.h 00:03:42.352 TEST_HEADER include/spdk/crc32.h 00:03:42.352 TEST_HEADER include/spdk/crc64.h 00:03:42.352 TEST_HEADER include/spdk/dif.h 00:03:42.352 TEST_HEADER include/spdk/dma.h 00:03:42.352 TEST_HEADER include/spdk/endian.h 00:03:42.352 CC app/nvmf_tgt/nvmf_main.o 00:03:42.352 TEST_HEADER include/spdk/env_dpdk.h 00:03:42.352 TEST_HEADER include/spdk/env.h 00:03:42.352 CC app/spdk_dd/spdk_dd.o 00:03:42.352 TEST_HEADER include/spdk/event.h 00:03:42.352 CC app/iscsi_tgt/iscsi_tgt.o 00:03:42.352 TEST_HEADER include/spdk/fd_group.h 00:03:42.352 CC app/vhost/vhost.o 00:03:42.352 CC examples/interrupt_tgt/interrupt_tgt.o 00:03:42.352 TEST_HEADER include/spdk/fd.h 00:03:42.352 TEST_HEADER include/spdk/file.h 00:03:42.352 CC test/thread/lock/spdk_lock.o 00:03:42.352 TEST_HEADER include/spdk/ftl.h 00:03:42.352 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:03:42.352 CC test/nvme/doorbell_aers/doorbell_aers.o 00:03:42.352 TEST_HEADER include/spdk/gpt_spec.h 00:03:42.352 CC app/spdk_tgt/spdk_tgt.o 00:03:42.352 CC test/app/histogram_perf/histogram_perf.o 00:03:42.352 TEST_HEADER include/spdk/hexlify.h 00:03:42.352 CC examples/nvme/nvme_manage/nvme_manage.o 00:03:42.352 TEST_HEADER include/spdk/histogram_data.h 00:03:42.352 CC test/thread/poller_perf/poller_perf.o 00:03:42.352 CC test/nvme/err_injection/err_injection.o 00:03:42.352 CC test/env/memory/memory_ut.o 00:03:42.352 CC examples/nvme/reconnect/reconnect.o 00:03:42.352 TEST_HEADER include/spdk/idxd.h 00:03:42.352 CC examples/vmd/lsvmd/lsvmd.o 00:03:42.352 CC test/env/pci/pci_ut.o 00:03:42.352 CC examples/accel/perf/accel_perf.o 00:03:42.352 CC test/env/vtophys/vtophys.o 00:03:42.352 TEST_HEADER 
include/spdk/idxd_spec.h 00:03:42.352 CC test/nvme/e2edp/nvme_dp.o 00:03:42.352 CC test/nvme/fused_ordering/fused_ordering.o 00:03:42.352 TEST_HEADER include/spdk/init.h 00:03:42.352 CC examples/nvme/arbitration/arbitration.o 00:03:42.352 CC examples/nvme/hotplug/hotplug.o 00:03:42.352 CC examples/nvme/cmb_copy/cmb_copy.o 00:03:42.352 TEST_HEADER include/spdk/ioat.h 00:03:42.352 CC examples/ioat/verify/verify.o 00:03:42.352 CC test/nvme/compliance/nvme_compliance.o 00:03:42.352 CC test/event/event_perf/event_perf.o 00:03:42.352 CC test/nvme/boot_partition/boot_partition.o 00:03:42.352 TEST_HEADER include/spdk/ioat_spec.h 00:03:42.352 CC examples/nvme/hello_world/hello_world.o 00:03:42.352 CC test/event/reactor/reactor.o 00:03:42.352 CC test/app/jsoncat/jsoncat.o 00:03:42.352 CC app/fio/nvme/fio_plugin.o 00:03:42.352 CC examples/vmd/led/led.o 00:03:42.352 TEST_HEADER include/spdk/iscsi_spec.h 00:03:42.352 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:03:42.352 CC examples/sock/hello_world/hello_sock.o 00:03:42.352 CC test/nvme/connect_stress/connect_stress.o 00:03:42.352 CC test/nvme/startup/startup.o 00:03:42.352 TEST_HEADER include/spdk/json.h 00:03:42.352 CC test/nvme/sgl/sgl.o 00:03:42.352 CC test/nvme/aer/aer.o 00:03:42.352 CC examples/ioat/perf/perf.o 00:03:42.352 CC test/nvme/simple_copy/simple_copy.o 00:03:42.352 CC test/nvme/overhead/overhead.o 00:03:42.352 CC test/nvme/reserve/reserve.o 00:03:42.352 CC examples/nvme/abort/abort.o 00:03:42.352 CC test/nvme/reset/reset.o 00:03:42.352 TEST_HEADER include/spdk/jsonrpc.h 00:03:42.352 CC examples/util/zipf/zipf.o 00:03:42.352 CC test/app/stub/stub.o 00:03:42.352 CC test/event/reactor_perf/reactor_perf.o 00:03:42.352 TEST_HEADER include/spdk/likely.h 00:03:42.352 CC examples/idxd/perf/perf.o 00:03:42.352 TEST_HEADER include/spdk/log.h 00:03:42.352 TEST_HEADER include/spdk/lvol.h 00:03:42.352 TEST_HEADER include/spdk/memory.h 00:03:42.352 TEST_HEADER include/spdk/mmio.h 00:03:42.352 CC test/event/app_repeat/app_repeat.o 00:03:42.352 TEST_HEADER include/spdk/nbd.h 00:03:42.352 TEST_HEADER include/spdk/notify.h 00:03:42.352 TEST_HEADER include/spdk/nvme.h 00:03:42.352 TEST_HEADER include/spdk/nvme_intel.h 00:03:42.352 TEST_HEADER include/spdk/nvme_ocssd.h 00:03:42.352 CC examples/nvmf/nvmf/nvmf.o 00:03:42.352 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:03:42.352 TEST_HEADER include/spdk/nvme_spec.h 00:03:42.352 CC test/accel/dif/dif.o 00:03:42.352 CC test/dma/test_dma/test_dma.o 00:03:42.352 TEST_HEADER include/spdk/nvme_zns.h 00:03:42.352 CC test/bdev/bdevio/bdevio.o 00:03:42.352 TEST_HEADER include/spdk/nvmf_cmd.h 00:03:42.352 CC examples/bdev/hello_world/hello_bdev.o 00:03:42.352 CC test/blobfs/mkfs/mkfs.o 00:03:42.352 CC test/event/scheduler/scheduler.o 00:03:42.352 CC examples/thread/thread/thread_ex.o 00:03:42.352 LINK spdk_lspci 00:03:42.352 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:03:42.352 TEST_HEADER include/spdk/nvmf.h 00:03:42.352 CC test/lvol/esnap/esnap.o 00:03:42.352 CC examples/blob/hello_world/hello_blob.o 00:03:42.352 TEST_HEADER include/spdk/nvmf_spec.h 00:03:42.352 LINK spdk_nvme_discover 00:03:42.352 TEST_HEADER include/spdk/nvmf_transport.h 00:03:42.614 CC test/env/mem_callbacks/mem_callbacks.o 00:03:42.614 TEST_HEADER include/spdk/opal.h 00:03:42.614 TEST_HEADER include/spdk/opal_spec.h 00:03:42.614 CC test/app/bdev_svc/bdev_svc.o 00:03:42.614 TEST_HEADER include/spdk/pci_ids.h 00:03:42.614 TEST_HEADER include/spdk/pipe.h 00:03:42.614 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:03:42.614 TEST_HEADER 
include/spdk/queue.h 00:03:42.614 TEST_HEADER include/spdk/reduce.h 00:03:42.614 TEST_HEADER include/spdk/rpc.h 00:03:42.614 LINK rpc_client_test 00:03:42.614 TEST_HEADER include/spdk/scheduler.h 00:03:42.614 TEST_HEADER include/spdk/scsi.h 00:03:42.614 TEST_HEADER include/spdk/scsi_spec.h 00:03:42.614 TEST_HEADER include/spdk/sock.h 00:03:42.614 LINK spdk_trace_record 00:03:42.614 TEST_HEADER include/spdk/stdinc.h 00:03:42.614 TEST_HEADER include/spdk/string.h 00:03:42.614 TEST_HEADER include/spdk/thread.h 00:03:42.614 TEST_HEADER include/spdk/trace.h 00:03:42.614 TEST_HEADER include/spdk/trace_parser.h 00:03:42.614 TEST_HEADER include/spdk/tree.h 00:03:42.614 TEST_HEADER include/spdk/ublk.h 00:03:42.614 TEST_HEADER include/spdk/util.h 00:03:42.614 TEST_HEADER include/spdk/uuid.h 00:03:42.614 TEST_HEADER include/spdk/version.h 00:03:42.614 TEST_HEADER include/spdk/vfio_user_pci.h 00:03:42.614 LINK histogram_perf 00:03:42.614 TEST_HEADER include/spdk/vfio_user_spec.h 00:03:42.614 LINK poller_perf 00:03:42.614 TEST_HEADER include/spdk/vhost.h 00:03:42.614 TEST_HEADER include/spdk/vmd.h 00:03:42.614 TEST_HEADER include/spdk/xor.h 00:03:42.614 LINK event_perf 00:03:42.614 TEST_HEADER include/spdk/zipf.h 00:03:42.614 CXX test/cpp_headers/accel.o 00:03:42.614 LINK lsvmd 00:03:42.614 LINK env_dpdk_post_init 00:03:42.614 LINK jsoncat 00:03:42.614 LINK reactor 00:03:42.614 LINK vtophys 00:03:42.614 LINK connect_stress 00:03:42.614 LINK led 00:03:42.614 LINK reactor_perf 00:03:42.614 LINK nvmf_tgt 00:03:42.614 LINK doorbell_aers 00:03:42.614 LINK zipf 00:03:42.614 LINK vhost 00:03:42.614 LINK boot_partition 00:03:42.614 LINK interrupt_tgt 00:03:42.614 LINK app_repeat 00:03:42.614 LINK startup 00:03:42.614 LINK err_injection 00:03:42.614 LINK stub 00:03:42.614 LINK iscsi_tgt 00:03:42.614 LINK fused_ordering 00:03:42.614 fio_plugin.c:1491:29: warning: field 'ruhs' with variable sized type 'struct spdk_nvme_fdp_ruhs' not at the end of a struct or class is a GNU extension [-Wgnu-variable-sized-type-not-at-end] 00:03:42.614 struct spdk_nvme_fdp_ruhs ruhs; 00:03:42.614 ^ 00:03:42.614 LINK reserve 00:03:42.614 LINK pmr_persistence 00:03:42.614 LINK verify 00:03:42.614 LINK cmb_copy 00:03:42.872 LINK ioat_perf 00:03:42.872 LINK hello_world 00:03:42.872 LINK spdk_tgt 00:03:42.872 LINK sgl 00:03:42.872 LINK hotplug 00:03:42.872 LINK simple_copy 00:03:42.872 LINK hello_sock 00:03:42.872 LINK bdev_svc 00:03:42.872 LINK reset 00:03:42.872 LINK mkfs 00:03:42.872 LINK aer 00:03:42.872 LINK nvme_dp 00:03:42.872 LINK reconnect 00:03:42.872 LINK scheduler 00:03:42.872 LINK overhead 00:03:42.872 LINK hello_bdev 00:03:42.872 CXX test/cpp_headers/accel_module.o 00:03:42.872 LINK hello_blob 00:03:42.872 LINK thread 00:03:42.872 LINK spdk_trace 00:03:42.872 LINK pci_ut 00:03:42.872 LINK test_dma 00:03:42.872 LINK idxd_perf 00:03:42.872 LINK accel_perf 00:03:42.872 LINK nvmf 00:03:42.872 LINK arbitration 00:03:42.872 LINK abort 00:03:43.133 LINK bdevio 00:03:43.133 LINK nvme_manage 00:03:43.133 LINK nvme_compliance 00:03:43.133 CXX test/cpp_headers/assert.o 00:03:43.133 LINK spdk_dd 00:03:43.133 LINK dif 00:03:43.133 LINK nvme_fuzz 00:03:43.133 1 warning generated. 
00:03:43.393 LINK spdk_nvme 00:03:43.393 CC test/nvme/fdp/fdp.o 00:03:43.393 LINK mem_callbacks 00:03:43.393 LINK spdk_nvme_identify 00:03:43.393 CXX test/cpp_headers/barrier.o 00:03:43.393 CXX test/cpp_headers/base64.o 00:03:43.393 CC test/nvme/cuse/cuse.o 00:03:43.393 LINK spdk_nvme_perf 00:03:43.393 CXX test/cpp_headers/bdev.o 00:03:43.655 CXX test/cpp_headers/bdev_module.o 00:03:43.655 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:03:43.655 CXX test/cpp_headers/bdev_zone.o 00:03:43.655 LINK fdp 00:03:43.655 CC examples/bdev/bdevperf/bdevperf.o 00:03:43.655 CC app/fio/bdev/fio_plugin.o 00:03:43.655 CC examples/blob/cli/blobcli.o 00:03:43.655 LINK spdk_top 00:03:43.655 CXX test/cpp_headers/bit_array.o 00:03:43.655 CXX test/cpp_headers/bit_pool.o 00:03:43.655 CXX test/cpp_headers/blob_bdev.o 00:03:43.655 CXX test/cpp_headers/blobfs_bdev.o 00:03:43.922 CC test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.o 00:03:43.922 LINK memory_ut 00:03:43.922 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:03:43.922 CC test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.o 00:03:43.922 CXX test/cpp_headers/blobfs.o 00:03:43.922 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:03:43.922 CXX test/cpp_headers/blob.o 00:03:43.922 CXX test/cpp_headers/conf.o 00:03:43.922 CXX test/cpp_headers/config.o 00:03:43.922 CXX test/cpp_headers/cpuset.o 00:03:43.922 CXX test/cpp_headers/crc16.o 00:03:43.922 CXX test/cpp_headers/crc32.o 00:03:43.922 CXX test/cpp_headers/crc64.o 00:03:43.922 CXX test/cpp_headers/dif.o 00:03:43.922 CXX test/cpp_headers/dma.o 00:03:43.922 CXX test/cpp_headers/endian.o 00:03:43.922 CXX test/cpp_headers/env_dpdk.o 00:03:43.922 CXX test/cpp_headers/env.o 00:03:44.182 CXX test/cpp_headers/event.o 00:03:44.182 CXX test/cpp_headers/fd_group.o 00:03:44.182 CXX test/cpp_headers/fd.o 00:03:44.182 CXX test/cpp_headers/ftl.o 00:03:44.182 CXX test/cpp_headers/file.o 00:03:44.182 CXX test/cpp_headers/gpt_spec.o 00:03:44.182 CXX test/cpp_headers/hexlify.o 00:03:44.182 CXX test/cpp_headers/histogram_data.o 00:03:44.182 CXX test/cpp_headers/idxd.o 00:03:44.182 CXX test/cpp_headers/idxd_spec.o 00:03:44.182 CXX test/cpp_headers/init.o 00:03:44.182 CXX test/cpp_headers/ioat.o 00:03:44.182 CXX test/cpp_headers/ioat_spec.o 00:03:44.182 CXX test/cpp_headers/iscsi_spec.o 00:03:44.182 CXX test/cpp_headers/json.o 00:03:44.182 CXX test/cpp_headers/jsonrpc.o 00:03:44.182 CXX test/cpp_headers/likely.o 00:03:44.182 CXX test/cpp_headers/log.o 00:03:44.182 CXX test/cpp_headers/lvol.o 00:03:44.182 CXX test/cpp_headers/memory.o 00:03:44.182 CXX test/cpp_headers/mmio.o 00:03:44.182 CXX test/cpp_headers/nbd.o 00:03:44.182 CXX test/cpp_headers/notify.o 00:03:44.182 CXX test/cpp_headers/nvme.o 00:03:44.182 CXX test/cpp_headers/nvme_intel.o 00:03:44.182 CXX test/cpp_headers/nvme_ocssd.o 00:03:44.182 CXX test/cpp_headers/nvme_ocssd_spec.o 00:03:44.182 CXX test/cpp_headers/nvme_spec.o 00:03:44.182 LINK spdk_bdev 00:03:44.440 CXX test/cpp_headers/nvme_zns.o 00:03:44.441 CXX test/cpp_headers/nvmf_cmd.o 00:03:44.441 CXX test/cpp_headers/nvmf_fc_spec.o 00:03:44.441 LINK llvm_vfio_fuzz 00:03:44.441 CXX test/cpp_headers/nvmf.o 00:03:44.441 CXX test/cpp_headers/nvmf_spec.o 00:03:44.441 CXX test/cpp_headers/nvmf_transport.o 00:03:44.441 CXX test/cpp_headers/opal.o 00:03:44.441 CXX test/cpp_headers/opal_spec.o 00:03:44.441 CXX test/cpp_headers/pci_ids.o 00:03:44.441 CXX test/cpp_headers/pipe.o 00:03:44.441 CXX test/cpp_headers/queue.o 00:03:44.441 CXX test/cpp_headers/reduce.o 00:03:44.441 LINK blobcli 00:03:44.441 CXX test/cpp_headers/rpc.o 00:03:44.441 CXX 
test/cpp_headers/scheduler.o 00:03:44.441 CXX test/cpp_headers/scsi.o 00:03:44.441 CXX test/cpp_headers/scsi_spec.o 00:03:44.441 CXX test/cpp_headers/sock.o 00:03:44.441 CXX test/cpp_headers/stdinc.o 00:03:44.441 CXX test/cpp_headers/string.o 00:03:44.441 CXX test/cpp_headers/thread.o 00:03:44.441 CXX test/cpp_headers/trace.o 00:03:44.441 CXX test/cpp_headers/trace_parser.o 00:03:44.441 CXX test/cpp_headers/tree.o 00:03:44.441 CXX test/cpp_headers/ublk.o 00:03:44.441 CXX test/cpp_headers/util.o 00:03:44.441 CXX test/cpp_headers/uuid.o 00:03:44.441 CXX test/cpp_headers/version.o 00:03:44.441 CXX test/cpp_headers/vfio_user_pci.o 00:03:44.441 CXX test/cpp_headers/vfio_user_spec.o 00:03:44.441 CXX test/cpp_headers/vhost.o 00:03:44.441 CXX test/cpp_headers/vmd.o 00:03:44.441 CXX test/cpp_headers/xor.o 00:03:44.441 CXX test/cpp_headers/zipf.o 00:03:44.441 LINK vhost_fuzz 00:03:44.441 LINK spdk_lock 00:03:44.698 LINK cuse 00:03:44.698 LINK bdevperf 00:03:44.698 LINK llvm_nvme_fuzz 00:03:45.299 LINK iscsi_fuzz 00:03:47.839 LINK esnap 00:03:48.098 00:03:48.098 real 0m31.392s 00:03:48.098 user 6m0.334s 00:03:48.098 sys 1m57.090s 00:03:48.098 11:58:01 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:03:48.098 11:58:01 -- common/autotest_common.sh@10 -- $ set +x 00:03:48.098 ************************************ 00:03:48.098 END TEST make 00:03:48.098 ************************************ 00:03:48.357 11:58:01 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:03:48.357 11:58:01 -- nvmf/common.sh@7 -- # uname -s 00:03:48.357 11:58:01 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:03:48.357 11:58:01 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:03:48.357 11:58:01 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:03:48.357 11:58:01 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:03:48.357 11:58:01 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:03:48.357 11:58:01 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:03:48.357 11:58:01 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:03:48.357 11:58:01 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:03:48.357 11:58:01 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:03:48.357 11:58:01 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:03:48.357 11:58:01 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:8023d868-666a-e711-906e-0017a4403562 00:03:48.357 11:58:01 -- nvmf/common.sh@18 -- # NVME_HOSTID=8023d868-666a-e711-906e-0017a4403562 00:03:48.357 11:58:01 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:03:48.357 11:58:01 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:03:48.357 11:58:01 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:03:48.357 11:58:01 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:03:48.357 11:58:01 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:03:48.357 11:58:01 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:03:48.357 11:58:01 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:03:48.357 11:58:01 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:48.357 11:58:01 -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:48.357 11:58:01 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:48.357 11:58:01 -- paths/export.sh@5 -- # export PATH 00:03:48.357 11:58:01 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:48.357 11:58:01 -- nvmf/common.sh@46 -- # : 0 00:03:48.357 11:58:01 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:03:48.357 11:58:01 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:03:48.357 11:58:01 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:03:48.357 11:58:01 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:03:48.357 11:58:01 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:03:48.357 11:58:01 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:03:48.357 11:58:01 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:03:48.357 11:58:01 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:03:48.357 11:58:01 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:03:48.357 11:58:01 -- spdk/autotest.sh@32 -- # uname -s 00:03:48.357 11:58:01 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:03:48.357 11:58:01 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:03:48.357 11:58:01 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:03:48.357 11:58:01 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:03:48.357 11:58:01 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:03:48.357 11:58:01 -- spdk/autotest.sh@44 -- # modprobe nbd 00:03:48.357 11:58:01 -- spdk/autotest.sh@46 -- # type -P udevadm 00:03:48.357 11:58:01 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:03:48.357 11:58:01 -- spdk/autotest.sh@48 -- # udevadm_pid=2618436 00:03:48.357 11:58:01 -- spdk/autotest.sh@51 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:03:48.357 11:58:01 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:03:48.357 11:58:01 -- spdk/autotest.sh@54 -- # echo 2618438 00:03:48.357 11:58:01 -- spdk/autotest.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:03:48.357 11:58:01 -- spdk/autotest.sh@56 -- # echo 2618439 00:03:48.357 11:58:01 -- spdk/autotest.sh@55 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:03:48.357 11:58:01 -- spdk/autotest.sh@58 -- # [[ ............................... 
!= QEMU ]] 00:03:48.357 11:58:01 -- spdk/autotest.sh@60 -- # echo 2618440 00:03:48.357 11:58:01 -- spdk/autotest.sh@59 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:03:48.357 11:58:01 -- spdk/autotest.sh@62 -- # echo 2618441 00:03:48.357 11:58:01 -- spdk/autotest.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:03:48.357 11:58:01 -- spdk/autotest.sh@66 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:03:48.357 11:58:01 -- spdk/autotest.sh@68 -- # timing_enter autotest 00:03:48.357 11:58:01 -- common/autotest_common.sh@712 -- # xtrace_disable 00:03:48.357 11:58:01 -- common/autotest_common.sh@10 -- # set +x 00:03:48.357 11:58:01 -- spdk/autotest.sh@70 -- # create_test_list 00:03:48.357 11:58:01 -- common/autotest_common.sh@736 -- # xtrace_disable 00:03:48.357 11:58:01 -- common/autotest_common.sh@10 -- # set +x 00:03:48.357 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.bmc.pm.log 00:03:48.357 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pm.log 00:03:48.357 11:58:01 -- spdk/autotest.sh@72 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh 00:03:48.357 11:58:01 -- spdk/autotest.sh@72 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:48.357 11:58:01 -- spdk/autotest.sh@72 -- # src=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:48.357 11:58:01 -- spdk/autotest.sh@73 -- # out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:03:48.357 11:58:01 -- spdk/autotest.sh@74 -- # cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:48.357 11:58:01 -- spdk/autotest.sh@76 -- # freebsd_update_contigmem_mod 00:03:48.357 11:58:01 -- common/autotest_common.sh@1440 -- # uname 00:03:48.357 11:58:01 -- common/autotest_common.sh@1440 -- # '[' Linux = FreeBSD ']' 00:03:48.357 11:58:01 -- spdk/autotest.sh@77 -- # freebsd_set_maxsock_buf 00:03:48.357 11:58:01 -- common/autotest_common.sh@1460 -- # uname 00:03:48.357 11:58:01 -- common/autotest_common.sh@1460 -- # [[ Linux = FreeBSD ]] 00:03:48.357 11:58:01 -- spdk/autotest.sh@82 -- # grep CC_TYPE mk/cc.mk 00:03:48.357 11:58:01 -- spdk/autotest.sh@82 -- # CC_TYPE=CC_TYPE=clang 00:03:48.357 11:58:01 -- spdk/autotest.sh@83 -- # hash lcov 00:03:48.357 11:58:01 -- spdk/autotest.sh@83 -- # [[ CC_TYPE=clang == *\c\l\a\n\g* ]] 00:03:48.357 11:58:01 -- spdk/autotest.sh@100 -- # timing_enter pre_cleanup 00:03:48.357 11:58:01 -- common/autotest_common.sh@712 -- # xtrace_disable 00:03:48.357 11:58:01 -- common/autotest_common.sh@10 -- # set +x 00:03:48.357 11:58:01 -- spdk/autotest.sh@102 -- # rm -f 00:03:48.357 11:58:01 -- spdk/autotest.sh@105 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:52.550 0000:1a:00.0 (8086 0a54): Already using the nvme driver 00:03:52.550 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:03:52.550 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:03:52.550 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:03:52.550 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:03:52.550 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:03:52.550 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:03:52.550 
0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:03:52.550 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:03:52.550 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:03:52.550 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:03:52.550 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:03:52.550 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:03:52.550 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:03:52.809 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:03:52.809 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:03:52.809 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:03:54.713 11:58:07 -- spdk/autotest.sh@107 -- # get_zoned_devs 00:03:54.713 11:58:07 -- common/autotest_common.sh@1654 -- # zoned_devs=() 00:03:54.713 11:58:07 -- common/autotest_common.sh@1654 -- # local -gA zoned_devs 00:03:54.713 11:58:07 -- common/autotest_common.sh@1655 -- # local nvme bdf 00:03:54.713 11:58:07 -- common/autotest_common.sh@1657 -- # for nvme in /sys/block/nvme* 00:03:54.713 11:58:07 -- common/autotest_common.sh@1658 -- # is_block_zoned nvme0n1 00:03:54.713 11:58:07 -- common/autotest_common.sh@1647 -- # local device=nvme0n1 00:03:54.713 11:58:07 -- common/autotest_common.sh@1649 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:54.713 11:58:07 -- common/autotest_common.sh@1650 -- # [[ none != none ]] 00:03:54.713 11:58:07 -- spdk/autotest.sh@109 -- # (( 0 > 0 )) 00:03:54.713 11:58:07 -- spdk/autotest.sh@121 -- # ls /dev/nvme0n1 00:03:54.713 11:58:07 -- spdk/autotest.sh@121 -- # grep -v p 00:03:54.713 11:58:07 -- spdk/autotest.sh@121 -- # for dev in $(ls /dev/nvme*n* | grep -v p || true) 00:03:54.713 11:58:07 -- spdk/autotest.sh@123 -- # [[ -z '' ]] 00:03:54.713 11:58:07 -- spdk/autotest.sh@124 -- # block_in_use /dev/nvme0n1 00:03:54.713 11:58:07 -- scripts/common.sh@380 -- # local block=/dev/nvme0n1 pt 00:03:54.713 11:58:07 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:03:54.713 No valid GPT data, bailing 00:03:54.713 11:58:07 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:03:54.713 11:58:07 -- scripts/common.sh@393 -- # pt= 00:03:54.713 11:58:07 -- scripts/common.sh@394 -- # return 1 00:03:54.713 11:58:07 -- spdk/autotest.sh@125 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:03:54.713 1+0 records in 00:03:54.713 1+0 records out 00:03:54.713 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00630914 s, 166 MB/s 00:03:54.713 11:58:07 -- spdk/autotest.sh@129 -- # sync 00:03:54.713 11:58:07 -- spdk/autotest.sh@131 -- # xtrace_disable_per_cmd reap_spdk_processes 00:03:54.713 11:58:07 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:03:54.713 11:58:07 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:03:59.991 11:58:12 -- spdk/autotest.sh@135 -- # uname -s 00:03:59.991 11:58:12 -- spdk/autotest.sh@135 -- # '[' Linux = Linux ']' 00:03:59.991 11:58:12 -- spdk/autotest.sh@136 -- # run_test setup.sh /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:03:59.991 11:58:12 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:59.991 11:58:12 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:59.991 11:58:12 -- common/autotest_common.sh@10 -- # set +x 00:03:59.991 ************************************ 00:03:59.991 START TEST setup.sh 00:03:59.991 ************************************ 00:03:59.992 11:58:12 -- 
common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:03:59.992 * Looking for test storage... 00:03:59.992 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:59.992 11:58:12 -- setup/test-setup.sh@10 -- # uname -s 00:03:59.992 11:58:12 -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:03:59.992 11:58:12 -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:03:59.992 11:58:12 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:59.992 11:58:12 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:59.992 11:58:12 -- common/autotest_common.sh@10 -- # set +x 00:03:59.992 ************************************ 00:03:59.992 START TEST acl 00:03:59.992 ************************************ 00:03:59.992 11:58:12 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:03:59.992 * Looking for test storage... 00:03:59.992 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:59.992 11:58:12 -- setup/acl.sh@10 -- # get_zoned_devs 00:03:59.992 11:58:12 -- common/autotest_common.sh@1654 -- # zoned_devs=() 00:03:59.992 11:58:12 -- common/autotest_common.sh@1654 -- # local -gA zoned_devs 00:03:59.992 11:58:12 -- common/autotest_common.sh@1655 -- # local nvme bdf 00:03:59.992 11:58:12 -- common/autotest_common.sh@1657 -- # for nvme in /sys/block/nvme* 00:03:59.992 11:58:12 -- common/autotest_common.sh@1658 -- # is_block_zoned nvme0n1 00:03:59.992 11:58:12 -- common/autotest_common.sh@1647 -- # local device=nvme0n1 00:03:59.992 11:58:12 -- common/autotest_common.sh@1649 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:59.992 11:58:12 -- common/autotest_common.sh@1650 -- # [[ none != none ]] 00:03:59.992 11:58:12 -- setup/acl.sh@12 -- # devs=() 00:03:59.992 11:58:12 -- setup/acl.sh@12 -- # declare -a devs 00:03:59.992 11:58:12 -- setup/acl.sh@13 -- # drivers=() 00:03:59.992 11:58:12 -- setup/acl.sh@13 -- # declare -A drivers 00:03:59.992 11:58:12 -- setup/acl.sh@51 -- # setup reset 00:03:59.992 11:58:12 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:59.992 11:58:12 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:06.560 11:58:19 -- setup/acl.sh@52 -- # collect_setup_devs 00:04:06.560 11:58:19 -- setup/acl.sh@16 -- # local dev driver 00:04:06.560 11:58:19 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:06.560 11:58:19 -- setup/acl.sh@15 -- # setup output status 00:04:06.560 11:58:19 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:06.560 11:58:19 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:04:10.757 Hugepages 00:04:10.757 node hugesize free / total 00:04:10.757 11:58:22 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:04:10.757 11:58:22 -- setup/acl.sh@19 -- # continue 00:04:10.757 11:58:22 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:10.757 11:58:22 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:04:10.757 11:58:22 -- setup/acl.sh@19 -- # continue 00:04:10.757 11:58:22 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:10.757 11:58:22 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:04:10.757 11:58:22 -- setup/acl.sh@19 -- # continue 00:04:10.757 11:58:22 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:10.757 00:04:10.757 Type BDF Vendor Device 
NUMA Driver Device Block devices 00:04:10.757 11:58:22 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:04:10.757 11:58:22 -- setup/acl.sh@19 -- # continue 00:04:10.757 11:58:22 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:10.757 11:58:23 -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:04:10.757 11:58:23 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:10.757 11:58:23 -- setup/acl.sh@20 -- # continue 00:04:10.757 11:58:23 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:10.757 11:58:23 -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:04:10.757 11:58:23 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:10.757 11:58:23 -- setup/acl.sh@20 -- # continue 00:04:10.757 11:58:23 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:10.757 11:58:23 -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:04:10.757 11:58:23 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:10.757 11:58:23 -- setup/acl.sh@20 -- # continue 00:04:10.757 11:58:23 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:10.757 11:58:23 -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:04:10.757 11:58:23 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:10.757 11:58:23 -- setup/acl.sh@20 -- # continue 00:04:10.757 11:58:23 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:10.757 11:58:23 -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:04:10.757 11:58:23 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:10.757 11:58:23 -- setup/acl.sh@20 -- # continue 00:04:10.757 11:58:23 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:10.757 11:58:23 -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:04:10.757 11:58:23 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:10.757 11:58:23 -- setup/acl.sh@20 -- # continue 00:04:10.757 11:58:23 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:10.757 11:58:23 -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:04:10.757 11:58:23 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:10.757 11:58:23 -- setup/acl.sh@20 -- # continue 00:04:10.757 11:58:23 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:10.757 11:58:23 -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:04:10.757 11:58:23 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:10.757 11:58:23 -- setup/acl.sh@20 -- # continue 00:04:10.757 11:58:23 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:10.757 11:58:23 -- setup/acl.sh@19 -- # [[ 0000:1a:00.0 == *:*:*.* ]] 00:04:10.757 11:58:23 -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:04:10.757 11:58:23 -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\1\a\:\0\0\.\0* ]] 00:04:10.757 11:58:23 -- setup/acl.sh@22 -- # devs+=("$dev") 00:04:10.757 11:58:23 -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:04:10.757 11:58:23 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:10.757 11:58:23 -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:04:10.757 11:58:23 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:10.757 11:58:23 -- setup/acl.sh@20 -- # continue 00:04:10.757 11:58:23 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:10.757 11:58:23 -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:04:10.757 11:58:23 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:10.757 11:58:23 -- setup/acl.sh@20 -- # continue 00:04:10.758 11:58:23 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:10.758 11:58:23 -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:04:10.758 11:58:23 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 
00:04:10.758 11:58:23 -- setup/acl.sh@20 -- # continue 00:04:10.758 11:58:23 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:10.758 11:58:23 -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:04:10.758 11:58:23 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:10.758 11:58:23 -- setup/acl.sh@20 -- # continue 00:04:10.758 11:58:23 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:10.758 11:58:23 -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:04:10.758 11:58:23 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:10.758 11:58:23 -- setup/acl.sh@20 -- # continue 00:04:10.758 11:58:23 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:10.758 11:58:23 -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:04:10.758 11:58:23 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:10.758 11:58:23 -- setup/acl.sh@20 -- # continue 00:04:10.758 11:58:23 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:10.758 11:58:23 -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:04:10.758 11:58:23 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:10.758 11:58:23 -- setup/acl.sh@20 -- # continue 00:04:10.758 11:58:23 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:10.758 11:58:23 -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:04:10.758 11:58:23 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:10.758 11:58:23 -- setup/acl.sh@20 -- # continue 00:04:10.758 11:58:23 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:10.758 11:58:23 -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:04:10.758 11:58:23 -- setup/acl.sh@54 -- # run_test denied denied 00:04:10.758 11:58:23 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:10.758 11:58:23 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:10.758 11:58:23 -- common/autotest_common.sh@10 -- # set +x 00:04:10.758 ************************************ 00:04:10.758 START TEST denied 00:04:10.758 ************************************ 00:04:10.758 11:58:23 -- common/autotest_common.sh@1104 -- # denied 00:04:10.758 11:58:23 -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:1a:00.0' 00:04:10.758 11:58:23 -- setup/acl.sh@38 -- # setup output config 00:04:10.758 11:58:23 -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:1a:00.0' 00:04:10.758 11:58:23 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:10.758 11:58:23 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:17.328 0000:1a:00.0 (8086 0a54): Skipping denied controller at 0000:1a:00.0 00:04:17.328 11:58:29 -- setup/acl.sh@40 -- # verify 0000:1a:00.0 00:04:17.328 11:58:29 -- setup/acl.sh@28 -- # local dev driver 00:04:17.328 11:58:29 -- setup/acl.sh@30 -- # for dev in "$@" 00:04:17.328 11:58:29 -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:1a:00.0 ]] 00:04:17.328 11:58:29 -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:1a:00.0/driver 00:04:17.328 11:58:29 -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:04:17.328 11:58:29 -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:04:17.328 11:58:29 -- setup/acl.sh@41 -- # setup reset 00:04:17.328 11:58:29 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:17.328 11:58:29 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:23.900 00:04:23.900 real 0m13.289s 00:04:23.900 user 0m4.256s 00:04:23.900 sys 0m8.296s 00:04:23.900 11:58:36 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:23.900 11:58:36 -- 
common/autotest_common.sh@10 -- # set +x 00:04:23.900 ************************************ 00:04:23.900 END TEST denied 00:04:23.900 ************************************ 00:04:23.900 11:58:36 -- setup/acl.sh@55 -- # run_test allowed allowed 00:04:23.900 11:58:36 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:23.900 11:58:36 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:23.900 11:58:36 -- common/autotest_common.sh@10 -- # set +x 00:04:23.900 ************************************ 00:04:23.901 START TEST allowed 00:04:23.901 ************************************ 00:04:23.901 11:58:36 -- common/autotest_common.sh@1104 -- # allowed 00:04:23.901 11:58:36 -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:1a:00.0 00:04:23.901 11:58:36 -- setup/acl.sh@45 -- # setup output config 00:04:23.901 11:58:36 -- setup/acl.sh@46 -- # grep -E '0000:1a:00.0 .*: nvme -> .*' 00:04:23.901 11:58:36 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:23.901 11:58:36 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:33.885 0000:1a:00.0 (8086 0a54): nvme -> vfio-pci 00:04:33.885 11:58:45 -- setup/acl.sh@47 -- # verify 00:04:33.885 11:58:45 -- setup/acl.sh@28 -- # local dev driver 00:04:33.885 11:58:45 -- setup/acl.sh@48 -- # setup reset 00:04:33.885 11:58:45 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:33.885 11:58:45 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:39.188 00:04:39.188 real 0m15.524s 00:04:39.188 user 0m4.243s 00:04:39.188 sys 0m8.166s 00:04:39.188 11:58:52 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:39.188 11:58:52 -- common/autotest_common.sh@10 -- # set +x 00:04:39.188 ************************************ 00:04:39.188 END TEST allowed 00:04:39.188 ************************************ 00:04:39.188 00:04:39.188 real 0m39.469s 00:04:39.188 user 0m12.208s 00:04:39.188 sys 0m23.682s 00:04:39.188 11:58:52 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:39.188 11:58:52 -- common/autotest_common.sh@10 -- # set +x 00:04:39.188 ************************************ 00:04:39.188 END TEST acl 00:04:39.188 ************************************ 00:04:39.188 11:58:52 -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:04:39.188 11:58:52 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:39.188 11:58:52 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:39.188 11:58:52 -- common/autotest_common.sh@10 -- # set +x 00:04:39.188 ************************************ 00:04:39.188 START TEST hugepages 00:04:39.188 ************************************ 00:04:39.188 11:58:52 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:04:39.481 * Looking for test storage... 
00:04:39.481 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup
00:04:39.481 11:58:52 -- setup/hugepages.sh@10 -- # nodes_sys=()
00:04:39.481 11:58:52 -- setup/hugepages.sh@10 -- # declare -a nodes_sys
00:04:39.481 11:58:52 -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0
00:04:39.481 11:58:52 -- setup/hugepages.sh@13 -- # declare -i no_nodes=0
00:04:39.481 11:58:52 -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0
00:04:39.481 11:58:52 -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize
00:04:39.481 11:58:52 -- setup/common.sh@17 -- # local get=Hugepagesize
00:04:39.481 11:58:52 -- setup/common.sh@18 -- # local node=
00:04:39.481 11:58:52 -- setup/common.sh@19 -- # local var val
00:04:39.481 11:58:52 -- setup/common.sh@20 -- # local mem_f mem
00:04:39.481 11:58:52 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:39.481 11:58:52 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:39.481 11:58:52 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:39.481 11:58:52 -- setup/common.sh@28 -- # mapfile -t mem
00:04:39.481 11:58:52 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:39.481 11:58:52 -- setup/common.sh@31 -- # IFS=': '
00:04:39.481 11:58:52 -- setup/common.sh@31 -- # read -r var val _
00:04:39.481 11:58:52 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293512 kB' 'MemFree: 66835176 kB' 'MemAvailable: 70739304 kB' 'Buffers: 8112 kB' 'Cached: 17654140 kB' 'SwapCached: 0 kB' 'Active: 14536536 kB' 'Inactive: 3720140 kB' 'Active(anon): 13989316 kB' 'Inactive(anon): 0 kB' 'Active(file): 547220 kB' 'Inactive(file): 3720140 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 597876 kB' 'Mapped: 189448 kB' 'Shmem: 13394892 kB' 'KReclaimable: 472736 kB' 'Slab: 881332 kB' 'SReclaimable: 472736 kB' 'SUnreclaim: 408596 kB' 'KernelStack: 16496 kB' 'PageTables: 9248 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52438208 kB' 'Committed_AS: 15461260 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214360 kB' 'VmallocChunk: 0 kB' 'Percpu: 69120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 1592732 kB' 'DirectMap2M: 41074688 kB' 'DirectMap1G: 58720256 kB'
[... xtrace condensed: setup/common.sh@32 tests every meminfo field from MemTotal through HugePages_Surp against Hugepagesize and skips each non-match via continue ...]
00:04:39.482 11:58:52 -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:39.483 11:58:52 -- setup/common.sh@33 -- # echo 2048
00:04:39.483 11:58:52 -- setup/common.sh@33 -- # return 0
00:04:39.483 11:58:52 -- setup/hugepages.sh@16 -- # default_hugepages=2048
00:04:39.483 11:58:52 -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages
00:04:39.483 11:58:52 -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages
00:04:39.483 11:58:52 -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC
00:04:39.483 11:58:52 -- setup/hugepages.sh@22 -- # unset -v HUGEMEM
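The get_meminfo call traced above slurps /proc/meminfo (or a node's meminfo file), strips any "Node N" prefix, and echoes the value of the requested field, here Hugepagesize -> 2048. A simplified stand-alone equivalent of what the trace shows (not the verbatim setup/common.sh source):

    #!/usr/bin/env bash
    shopt -s extglob                      # for the +([0-9]) pattern below
    get_meminfo() {                       # usage: get_meminfo <field> [node]
        local get=$1 node=${2:-}
        local mem_f=/proc/meminfo
        local -a mem
        local line var val _
        # per-node variant, e.g. /sys/devices/system/node/node0/meminfo
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")  # per-node lines carry a "Node N " prefix
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done
        return 1
    }
    get_meminfo Hugepagesize              # prints 2048 on this machine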
00:04:39.483 11:58:52 -- setup/hugepages.sh@23 -- # unset -v HUGENODE
00:04:39.483 11:58:52 -- setup/hugepages.sh@24 -- # unset -v NRHUGE
00:04:39.483 11:58:52 -- setup/hugepages.sh@207 -- # get_nodes
00:04:39.483 11:58:52 -- setup/hugepages.sh@27 -- # local node
00:04:39.483 11:58:52 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:39.483 11:58:52 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=2048
00:04:39.483 11:58:52 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:39.483 11:58:52 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:04:39.483 11:58:52 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:39.483 11:58:52 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:39.483 11:58:52 -- setup/hugepages.sh@208 -- # clear_hp
00:04:39.483 11:58:52 -- setup/hugepages.sh@37 -- # local node hp
00:04:39.483 11:58:52 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}"
00:04:39.483 11:58:52 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:04:39.483 11:58:52 -- setup/hugepages.sh@41 -- # echo 0
00:04:39.483 11:58:52 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:04:39.483 11:58:52 -- setup/hugepages.sh@41 -- # echo 0
00:04:39.483 11:58:52 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}"
00:04:39.483 11:58:52 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:04:39.483 11:58:52 -- setup/hugepages.sh@41 -- # echo 0
00:04:39.483 11:58:52 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:04:39.483 11:58:52 -- setup/hugepages.sh@41 -- # echo 0
00:04:39.483 11:58:52 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes
00:04:39.483 11:58:52 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes
00:04:39.483 11:58:52 -- setup/hugepages.sh@210 -- # run_test default_setup default_setup
00:04:39.483 11:58:52 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:04:39.483 11:58:52 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:04:39.483 11:58:52 -- common/autotest_common.sh@10 -- # set +x
00:04:39.483 ************************************
00:04:39.483 START TEST default_setup
00:04:39.483 ************************************
00:04:39.483 11:58:52 -- common/autotest_common.sh@1104 -- # default_setup
00:04:39.483 11:58:52 -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0
00:04:39.483 11:58:52 -- setup/hugepages.sh@49 -- # local size=2097152
00:04:39.483 11:58:52 -- setup/hugepages.sh@50 -- # (( 2 > 1 ))
00:04:39.483 11:58:52 -- setup/hugepages.sh@51 -- # shift
00:04:39.483 11:58:52 -- setup/hugepages.sh@52 -- # node_ids=('0')
00:04:39.483 11:58:52 -- setup/hugepages.sh@52 -- # local node_ids
00:04:39.483 11:58:52 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:39.483 11:58:52 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:04:39.483 11:58:52 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0
00:04:39.483 11:58:52 -- setup/hugepages.sh@62 -- # user_nodes=('0')
00:04:39.483 11:58:52 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:39.483 11:58:52 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:04:39.483 11:58:52 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:39.483 11:58:52 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:39.483 11:58:52 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:39.483 11:58:52 -- setup/hugepages.sh@69 -- # (( 1 > 0 ))
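get_nodes found two NUMA nodes (2048 pages currently on node 0, none on node 1), clear_hp zeroed every per-node hugepage pool, and get_test_nr_hugepages turned the requested 2097152 kB into 2097152 / 2048 = 1024 pages of the 2048 kB default size, all destined for node 0. A sketch of that clear-then-allocate cycle over the standard sysfs paths (run as root; an illustration, not the hugepages.sh source itself):

    # zero every hugepage pool on every node, then allocate on node 0
    for node in /sys/devices/system/node/node[0-9]*; do
        for hp in "$node"/hugepages/hugepages-*; do
            echo 0 > "$hp/nr_hugepages"
        done
    done
    size_kb=2097152 hugepagesize_kb=2048
    echo $((size_kb / hugepagesize_kb)) > \
        /sys/devices/system/node/node0/hugepages/hugepages-2048kB/nr_hugepages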
00:04:39.483 11:58:52 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:04:39.483 11:58:52 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024
00:04:39.483 11:58:52 -- setup/hugepages.sh@73 -- # return 0
00:04:39.483 11:58:52 -- setup/hugepages.sh@137 -- # setup output
00:04:39.483 11:58:52 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:39.483 11:58:52 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:43.672 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci
00:04:43.672 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci
00:04:43.672 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci
00:04:43.672 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci
00:04:43.672 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci
00:04:43.672 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci
00:04:43.672 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci
00:04:43.672 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci
00:04:43.672 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci
00:04:43.672 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci
00:04:43.672 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci
00:04:43.672 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci
00:04:43.672 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci
00:04:43.672 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci
00:04:43.672 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci
00:04:43.672 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci
00:04:46.961 0000:1a:00.0 (8086 0a54): nvme -> vfio-pci
00:04:48.870 11:59:01 -- setup/hugepages.sh@138 -- # verify_nr_hugepages
00:04:48.870 11:59:01 -- setup/hugepages.sh@89 -- # local node
00:04:48.870 11:59:01 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:48.870 11:59:01 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:48.870 11:59:01 -- setup/hugepages.sh@92 -- # local surp
00:04:48.870 11:59:01 -- setup/hugepages.sh@93 -- # local resv
00:04:48.870 11:59:01 -- setup/hugepages.sh@94 -- # local anon
00:04:48.870 11:59:01 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:48.870 11:59:01 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:48.870 11:59:01 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:48.870 11:59:01 -- setup/common.sh@18 -- # local node=
00:04:48.870 11:59:01 -- setup/common.sh@19 -- # local var val
00:04:48.870 11:59:01 -- setup/common.sh@20 -- # local mem_f mem
00:04:48.870 11:59:01 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:48.870 11:59:01 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:48.870 11:59:01 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:48.870 11:59:01 -- setup/common.sh@28 -- # mapfile -t mem
00:04:48.870 11:59:01 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:48.870 11:59:01 -- setup/common.sh@31 -- # IFS=': '
00:04:48.870 11:59:01 -- setup/common.sh@31 -- # read -r var val _
00:04:48.870 11:59:01 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293512 kB' 'MemFree: 69041900 kB' 'MemAvailable: 72945956 kB' 'Buffers: 8112 kB' 'Cached: 17654316 kB' 'SwapCached: 0 kB' 'Active: 14553168 kB' 'Inactive: 3720140 kB' 'Active(anon): 14005948 kB' 'Inactive(anon): 0 kB' 'Active(file): 547220 kB' 'Inactive(file): 3720140 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 614264 kB' 'Mapped: 189328 kB' 'Shmem: 13395068 kB' 'KReclaimable: 472664 kB' 'Slab: 880964 kB' 'SReclaimable: 472664 kB' 'SUnreclaim: 408300 kB' 'KernelStack: 16432 kB' 'PageTables: 9532 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486784 kB' 'Committed_AS: 15476008 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214184 kB' 'VmallocChunk: 0 kB' 'Percpu: 69120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1592732 kB' 'DirectMap2M: 41074688 kB' 'DirectMap1G: 58720256 kB'
[... xtrace condensed: setup/common.sh@32 tests each meminfo field against AnonHugePages and skips every non-match via continue ...]
00:04:48.871 11:59:01 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:48.871 11:59:01 -- setup/common.sh@33 -- # echo 0
00:04:48.871 11:59:01 -- setup/common.sh@33 -- # return 0
00:04:48.871 11:59:01 -- setup/hugepages.sh@97 -- # anon=0
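anon ends up 0 because AnonHugePages is 0 kB in the snapshot above; note the guard traced at setup/hugepages.sh@96, which only bothers reading AnonHugePages because transparent hugepages are not disabled (the string "always [madvise] never" does not contain "[never]"). A hedged reconstruction of that guard, reusing the get_meminfo sketch from earlier (the real script's wording may differ):

    anon=0
    if [[ $(</sys/kernel/mm/transparent_hugepage/enabled) != *'[never]'* ]]; then
        anon=$(get_meminfo AnonHugePages)   # 0 kB here, hence anon=0
    fi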
00:04:48.871 11:59:01 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:48.871 11:59:01 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:48.871 11:59:01 -- setup/common.sh@18 -- # local node=
00:04:48.871 11:59:01 -- setup/common.sh@19 -- # local var val
00:04:48.871 11:59:01 -- setup/common.sh@20 -- # local mem_f mem
00:04:48.871 11:59:01 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:48.871 11:59:01 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:48.871 11:59:01 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:48.871 11:59:01 -- setup/common.sh@28 -- # mapfile -t mem
00:04:48.871 11:59:01 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:48.871 11:59:01 -- setup/common.sh@31 -- # IFS=': '
00:04:48.871 11:59:01 -- setup/common.sh@31 -- # read -r var val _
00:04:48.872 11:59:01 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293512 kB' 'MemFree: 69045088 kB' 'MemAvailable: 72949144 kB' 'Buffers: 8112 kB' 'Cached: 17654320 kB' 'SwapCached: 0 kB' 'Active: 14553040 kB' 'Inactive: 3720140 kB' 'Active(anon): 14005820 kB' 'Inactive(anon): 0 kB' 'Active(file): 547220 kB' 'Inactive(file): 3720140 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 614128 kB' 'Mapped: 189432 kB' 'Shmem: 13395072 kB' 'KReclaimable: 472664 kB' 'Slab: 881004 kB' 'SReclaimable: 472664 kB' 'SUnreclaim: 408340 kB' 'KernelStack: 16352 kB' 'PageTables: 9296 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486784 kB' 'Committed_AS: 15476020 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214152 kB' 'VmallocChunk: 0 kB' 'Percpu: 69120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1592732 kB' 'DirectMap2M: 41074688 kB' 'DirectMap1G: 58720256 kB'
[... xtrace condensed: setup/common.sh@32 tests each meminfo field against HugePages_Surp and skips every non-match via continue ...]
00:04:48.873 11:59:01 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:48.873 11:59:01 -- setup/common.sh@33 -- # echo 0
00:04:48.873 11:59:01 -- setup/common.sh@33 -- # return 0
00:04:48.873 11:59:01 -- setup/hugepages.sh@99 -- # surp=0
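The HugePages_Surp and HugePages_Rsvd fields that verify_nr_hugepages pulls out of /proc/meminfo are also exported per page size under sysfs, so the same numbers can be cross-checked directly (standard kernel paths, not part of this trace):

    d=/sys/kernel/mm/hugepages/hugepages-2048kB
    cat "$d/nr_hugepages"        # HugePages_Total: 1024 here
    cat "$d/free_hugepages"      # HugePages_Free:  1024
    cat "$d/resv_hugepages"      # HugePages_Rsvd:  0
    cat "$d/surplus_hugepages"   # HugePages_Surp:  0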
[[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:48.873 11:59:01 -- setup/common.sh@32 -- # continue
[... the setup/common.sh@31-32 read/continue loop repeats identically for every remaining /proc/meminfo field (Cached, SwapCached, Active, Inactive, Active/Inactive(anon), Active/Inactive(file), Unevictable, Mlocked, SwapTotal, SwapFree, Zswap, Zswapped, Dirty, Writeback, AnonPages, Mapped, Shmem, KReclaimable, Slab, SReclaimable, SUnreclaim, KernelStack, PageTables, SecPageTables, NFS_Unstable, Bounce, WritebackTmp, CommitLimit, Committed_AS, VmallocTotal, VmallocUsed, VmallocChunk, Percpu, HardwareCorrupted, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, CmaTotal, CmaFree, Unaccepted, HugePages_Total, HugePages_Free) until the requested field comes up ...]
00:04:48.874 11:59:01 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:48.874 11:59:01 -- setup/common.sh@33 -- # echo 0
00:04:48.874 11:59:01 -- setup/common.sh@33 -- # return 0
00:04:48.874 11:59:01 -- setup/hugepages.sh@100 -- # resv=0
00:04:48.874 11:59:01 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:48.874 nr_hugepages=1024
00:04:48.874 11:59:01 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:48.874 resv_hugepages=0
00:04:48.874 11:59:01 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:48.874 surplus_hugepages=0
00:04:48.874 11:59:01 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:48.874 anon_hugepages=0
00:04:48.874 11:59:01 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:48.874 11:59:01 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
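For readability: the long runs of [[ <field> == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] checks condensed above are a single helper scanning /proc/meminfo line by line. Reconstructed from the xtrace, setup/common.sh's get_meminfo looks roughly like the sketch below; the if/elif shape around the per-node file is inferred from the @23/@25 branches rather than shown verbatim in the log.

    # sketch of get_meminfo (setup/common.sh), reconstructed from the xtrace above
    # requires: shopt -s extglob  (for the +([0-9]) patterns)
    get_meminfo() {
        local get=$1
        local node=$2
        local var val
        local mem_f mem
        mem_f=/proc/meminfo
        if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
            # per-node query: read that node's own meminfo instead
            mem_f=/sys/devices/system/node/node$node/meminfo
        elif [[ -n $node ]]; then
            return 1   # a node was requested but does not exist
        fi
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")   # strip the "Node N " prefix of per-node files
        while IFS=': ' read -r var val _; do
            # each failed comparison appears in the log as one [[ ... ]]/continue pair
            [[ $var == "$get" ]] && echo "$val" && return 0
        done < <(printf '%s\n' "${mem[@]}")
        return 1
    }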
00:04:48.874 11:59:01 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:48.874 11:59:01 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:48.874 11:59:01 -- setup/common.sh@18 -- # local node=
00:04:48.874 11:59:01 -- setup/common.sh@19 -- # local var val
00:04:48.874 11:59:01 -- setup/common.sh@20 -- # local mem_f mem
00:04:48.874 11:59:01 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:48.874 11:59:01 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:48.874 11:59:01 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:48.874 11:59:01 -- setup/common.sh@28 -- # mapfile -t mem
00:04:48.874 11:59:01 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:48.874 11:59:01 -- setup/common.sh@31 -- # IFS=': '
00:04:48.874 11:59:01 -- setup/common.sh@31 -- # read -r var val _
00:04:48.875 11:59:01 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293512 kB' 'MemFree: 69045428 kB' 'MemAvailable: 72949484 kB' 'Buffers: 8112 kB' 'Cached: 17654356 kB' 'SwapCached: 0 kB' 'Active: 14552716 kB' 'Inactive: 3720140 kB' 'Active(anon): 14005496 kB' 'Inactive(anon): 0 kB' 'Active(file): 547220 kB' 'Inactive(file): 3720140 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 613728 kB' 'Mapped: 189432 kB' 'Shmem: 13395108 kB' 'KReclaimable: 472664 kB' 'Slab: 880996 kB' 'SReclaimable: 472664 kB' 'SUnreclaim: 408332 kB' 'KernelStack: 16336 kB' 'PageTables: 9248 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486784 kB' 'Committed_AS: 15476052 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214168 kB' 'VmallocChunk: 0 kB' 'Percpu: 69120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1592732 kB' 'DirectMap2M: 41074688 kB' 'DirectMap1G: 58720256 kB'
[... the setup/common.sh@31-32 scan skips MemTotal through Unaccepted until the requested field matches ...]
00:04:48.876 11:59:01 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:48.876 11:59:01 -- setup/common.sh@33 -- # echo 1024
00:04:48.876 11:59:01 -- setup/common.sh@33 -- # return 0
00:04:48.876 11:59:01 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
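The same lookup can be spot-checked from any shell without the helper; an equivalent awk one-liner (illustration only, not part of the test scripts):

    # print a single /proc/meminfo field, e.g. HugePages_Total -> 1024
    awk -F': +' '$1 == "HugePages_Total" { print $2; exit }' /proc/meminfo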
00:04:48.876 11:59:01 -- setup/hugepages.sh@112 -- # get_nodes
00:04:48.876 11:59:01 -- setup/hugepages.sh@27 -- # local node
00:04:48.876 11:59:01 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:48.876 11:59:01 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:48.876 11:59:01 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:48.876 11:59:01 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:04:48.876 11:59:01 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:48.876 11:59:01 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:48.876 11:59:01 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:48.876 11:59:01 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:48.876 11:59:01 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:48.876 11:59:01 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:48.876 11:59:01 -- setup/common.sh@18 -- # local node=0
00:04:48.876 11:59:01 -- setup/common.sh@19 -- # local var val
00:04:48.876 11:59:01 -- setup/common.sh@20 -- # local mem_f mem
00:04:48.876 11:59:01 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:48.876 11:59:01 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:48.876 11:59:01 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:48.876 11:59:01 -- setup/common.sh@28 -- # mapfile -t mem
00:04:48.876 11:59:01 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:48.876 11:59:01 -- setup/common.sh@31 -- # IFS=': '
00:04:48.876 11:59:01 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069876 kB' 'MemFree: 28766724 kB' 'MemUsed: 19303152 kB' 'SwapCached: 0 kB' 'Active: 11776528 kB' 'Inactive: 3587012 kB' 'Active(anon): 11515020 kB' 'Inactive(anon): 0 kB' 'Active(file): 261508 kB' 'Inactive(file): 3587012 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 14896764 kB' 'Mapped: 111624 kB' 'AnonPages: 469936 kB' 'Shmem: 11048244 kB' 'KernelStack: 9096 kB' 'PageTables: 5388 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 288080 kB' 'Slab: 514968 kB' 'SReclaimable: 288080 kB' 'SUnreclaim: 226888 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:04:48.876 11:59:01 -- setup/common.sh@31 -- # read -r var val _
[... the setup/common.sh@31-32 scan skips the node0 fields MemTotal through HugePages_Free until the requested field matches ...]
00:04:48.877 11:59:01 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:48.877 11:59:01 -- setup/common.sh@33 -- # echo 0
00:04:48.877 11:59:01 -- setup/common.sh@33 -- # return 0
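get_nodes (hugepages.sh@27-33) enumerates the NUMA node directories and records each node's current hugepage count; xtrace only shows the already-expanded assignments (1024 and 0), so the command substitution in this sketch is an inference from how the helper is used elsewhere in the trace:

    # sketch of get_nodes (setup/hugepages.sh), reconstructed from the xtrace above
    get_nodes() {
        local node
        for node in /sys/devices/system/node/node+([0-9]); do   # extglob pattern
            # on this host: nodes_sys[0]=1024, nodes_sys[1]=0
            nodes_sys[${node##*node}]=$(get_meminfo HugePages_Total "${node##*node}")
        done
        no_nodes=${#nodes_sys[@]}   # 2 on this host
        ((no_nodes > 0))            # fail if no NUMA node was found
    }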
00:04:48.877 11:59:01 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:48.877 11:59:01 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:48.877 11:59:01 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:48.877 11:59:01 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:48.877 11:59:01 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:04:48.877 node0=1024 expecting 1024
00:04:48.877 11:59:01 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:04:48.877 real	0m9.564s
00:04:48.877 user	0m2.314s
00:04:48.877 sys	0m4.196s
00:04:48.877 11:59:01 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:48.877 11:59:01 -- common/autotest_common.sh@10 -- # set +x
00:04:48.877 ************************************
00:04:48.877 END TEST default_setup
00:04:48.877 ************************************
00:04:49.137 11:59:01 -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc
00:04:49.137 11:59:01 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:04:49.137 11:59:01 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:04:49.137 11:59:01 -- common/autotest_common.sh@10 -- # set +x
00:04:49.137 ************************************
00:04:49.137 START TEST per_node_1G_alloc
00:04:49.137 ************************************
00:04:49.137 11:59:01 -- common/autotest_common.sh@1104 -- # per_node_1G_alloc
00:04:49.137 11:59:01 -- setup/hugepages.sh@143 -- # local IFS=,
00:04:49.137 11:59:01 -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1
00:04:49.137 11:59:01 -- setup/hugepages.sh@49 -- # local size=1048576
00:04:49.137 11:59:01 -- setup/hugepages.sh@50 -- # (( 3 > 1 ))
00:04:49.137 11:59:01 -- setup/hugepages.sh@51 -- # shift
00:04:49.137 11:59:01 -- setup/hugepages.sh@52 -- # node_ids=('0' '1')
00:04:49.137 11:59:01 -- setup/hugepages.sh@52 -- # local node_ids
00:04:49.137 11:59:01 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:49.137 11:59:01 -- setup/hugepages.sh@57 -- # nr_hugepages=512
00:04:49.137 11:59:01 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1
00:04:49.137 11:59:01 -- setup/hugepages.sh@62 -- # user_nodes=('0' '1')
00:04:49.137 11:59:01 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:49.137 11:59:01 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512
00:04:49.137 11:59:01 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:49.137 11:59:01 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:49.137 11:59:01 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:49.137 11:59:01 -- setup/hugepages.sh@69 -- # (( 2 > 0 ))
00:04:49.137 11:59:01 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:04:49.137 11:59:01 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:04:49.137 11:59:01 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:04:49.137 11:59:01 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:04:49.137 11:59:01 -- setup/hugepages.sh@73 -- # return 0
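The division behind nr_hugepages=512 is not echoed by xtrace, but the numbers line up with the default hugepage size reported in the meminfo dumps above (Hugepagesize: 2048 kB): 1048576 / 2048 = 512. A minimal sketch of the sizing step under that assumption; the names follow the trace, and seeding default_hugepages from get_meminfo Hugepagesize is an inference:

    # hypothetical reconstruction of the sizing step in get_test_nr_hugepages
    size=1048576                                   # requested amount per node, in kB (1 GiB)
    default_hugepages=$(get_meminfo Hugepagesize)  # 2048 (kB) on this host
    if ((size >= default_hugepages)); then         # matches hugepages.sh@55
        nr_hugepages=$((size / default_hugepages)) # 1048576 / 2048 = 512
    fi
    echo "nr_hugepages=$nr_hugepages"              # 512, fanned out to nodes_test[0] and nodes_test[1]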
00:04:49.137 11:59:01 -- setup/hugepages.sh@146 -- # NRHUGE=512
00:04:49.137 11:59:01 -- setup/hugepages.sh@146 -- # HUGENODE=0,1
00:04:49.137 11:59:01 -- setup/hugepages.sh@146 -- # setup output
00:04:49.137 11:59:01 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:49.137 11:59:01 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:53.328 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:53.328 0000:1a:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:53.328 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:53.328 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:53.328 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:53.328 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:53.328 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:53.328 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:53.328 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:53.328 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:53.328 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:53.328 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:53.328 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:53.328 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:53.328 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:53.328 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:53.328 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:55.237 11:59:07 -- setup/hugepages.sh@147 -- # nr_hugepages=1024
00:04:55.237 11:59:07 -- setup/hugepages.sh@147 -- # verify_nr_hugepages
00:04:55.237 11:59:07 -- setup/hugepages.sh@89 -- # local node
00:04:55.237 11:59:07 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:55.237 11:59:07 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:55.237 11:59:07 -- setup/hugepages.sh@92 -- # local surp
00:04:55.237 11:59:07 -- setup/hugepages.sh@93 -- # local resv
00:04:55.237 11:59:07 -- setup/hugepages.sh@94 -- # local anon
00:04:55.237 11:59:07 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:55.237 11:59:07 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:55.237 11:59:07 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:55.237 11:59:07 -- setup/common.sh@18 -- # local node=
00:04:55.237 11:59:07 -- setup/common.sh@19 -- # local var val
00:04:55.237 11:59:07 -- setup/common.sh@20 -- # local mem_f mem
00:04:55.237 11:59:07 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:55.237 11:59:07 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:55.237 11:59:07 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:55.237 11:59:07 -- setup/common.sh@28 -- # mapfile -t mem
00:04:55.237 11:59:07 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:55.237 11:59:07 -- setup/common.sh@31 -- # IFS=': '
00:04:55.237 11:59:07 -- setup/common.sh@31 -- # read -r var val _
00:04:55.237 11:59:07 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293512 kB' 'MemFree: 69057184 kB' 'MemAvailable: 72961272 kB' 'Buffers: 8112 kB' 'Cached: 17654464 kB' 'SwapCached: 0 kB' 'Active: 14551232 kB' 'Inactive: 3720140 kB' 'Active(anon): 14004012 kB' 'Inactive(anon): 0 kB' 'Active(file): 547220 kB' 'Inactive(file): 3720140 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 612144 kB' 'Mapped: 188608 kB' 'Shmem: 13395216 kB' 'KReclaimable: 472696 kB' 'Slab: 881640 kB' 'SReclaimable: 472696 kB' 'SUnreclaim: 408944 kB' 'KernelStack: 16320 kB' 'PageTables: 8736 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486784 kB' 'Committed_AS: 15467844 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214328 kB' 'VmallocChunk: 0 kB' 'Percpu: 69120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1592732 kB' 'DirectMap2M: 41074688 kB' 'DirectMap1G: 58720256 kB'
[... the setup/common.sh@31-32 scan skips MemTotal through HardwareCorrupted until the requested field matches ...]
00:04:55.238 11:59:07 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:55.238 11:59:07 -- setup/common.sh@33 -- # echo 0
00:04:55.238 11:59:07 -- setup/common.sh@33 -- # return 0
00:04:55.238 11:59:07 -- setup/hugepages.sh@97 -- # anon=0
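The @96 test above ([[ always [madvise] never != *\[\n\e\v\e\r\]* ]]) is verify_nr_hugepages checking whether transparent hugepages are disabled before deciding to count AnonHugePages. A sketch of that guard, reconstructed from the trace; the sysfs read is the standard source of that "always [madvise] never" string, the bracketed word being the active policy (madvise here, so THP is not off):

    # sketch of the anon-hugepage guard in verify_nr_hugepages
    anon=0
    if [[ $(< /sys/kernel/mm/transparent_hugepage/enabled) != *"[never]"* ]]; then
        anon=$(get_meminfo AnonHugePages)   # 0 kB on this host
    fi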
00:04:55.238 11:59:07 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:55.238 11:59:07 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:55.238 11:59:07 -- setup/common.sh@18 -- # local node=
00:04:55.238 11:59:07 -- setup/common.sh@19 -- # local var val
00:04:55.238 11:59:07 -- setup/common.sh@20 -- # local mem_f mem
00:04:55.238 11:59:07 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:55.238 11:59:07 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:55.238 11:59:07 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:55.238 11:59:07 -- setup/common.sh@28 -- # mapfile -t mem
00:04:55.238 11:59:07 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:55.238 11:59:07 -- setup/common.sh@31 -- # IFS=': '
00:04:55.238 11:59:07 -- setup/common.sh@31 -- # read -r var val _
00:04:55.239 11:59:07 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293512 kB' 'MemFree: 69060108 kB' 'MemAvailable: 72964196 kB' 'Buffers: 8112 kB' 'Cached: 17654464 kB' 'SwapCached: 0 kB' 'Active: 14551344 kB' 'Inactive: 3720140 kB' 'Active(anon): 14004124 kB' 'Inactive(anon): 0 kB' 'Active(file): 547220 kB' 'Inactive(file): 3720140 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 612308 kB' 'Mapped: 188660 kB' 'Shmem: 13395216 kB' 'KReclaimable: 472696 kB' 'Slab: 881696 kB' 'SReclaimable: 472696 kB' 'SUnreclaim: 409000 kB' 'KernelStack: 16288 kB' 'PageTables: 8648 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486784 kB' 'Committed_AS: 15467100 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214232 kB' 'VmallocChunk: 0 kB' 'Percpu: 69120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1592732 kB' 'DirectMap2M: 41074688 kB' 'DirectMap1G: 58720256 kB'
[... the setup/common.sh@31-32 scan has so far skipped MemTotal, MemFree, MemAvailable, Buffers, Cached, SwapCached, Active, Inactive, Active/Inactive(anon), Active/Inactive(file), Unevictable, Mlocked, SwapTotal, SwapFree, Zswap, Zswapped, Dirty, Writeback, AnonPages, Mapped, Shmem, KReclaimable and Slab without matching HugePages_Surp ...]
00:04:55.239 11:59:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:55.239
11:59:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:55.239 11:59:07 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.239 11:59:07 -- setup/common.sh@32 -- # continue 00:04:55.239 11:59:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:55.239 11:59:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:55.239 11:59:07 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.239 11:59:07 -- setup/common.sh@32 -- # continue 00:04:55.239 11:59:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:55.239 11:59:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:55.239 11:59:07 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.239 11:59:07 -- setup/common.sh@32 -- # continue 00:04:55.239 11:59:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:55.239 11:59:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:55.239 11:59:07 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.239 11:59:07 -- setup/common.sh@32 -- # continue 00:04:55.239 11:59:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:55.239 11:59:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:55.239 11:59:07 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.239 11:59:07 -- setup/common.sh@32 -- # continue 00:04:55.239 11:59:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:55.239 11:59:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:55.239 11:59:07 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.239 11:59:07 -- setup/common.sh@32 -- # continue 00:04:55.239 11:59:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:55.239 11:59:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:55.239 11:59:07 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.239 11:59:07 -- setup/common.sh@32 -- # continue 00:04:55.239 11:59:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:55.239 11:59:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:55.239 11:59:07 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.239 11:59:07 -- setup/common.sh@32 -- # continue 00:04:55.239 11:59:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:55.239 11:59:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:55.239 11:59:07 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.239 11:59:07 -- setup/common.sh@32 -- # continue 00:04:55.239 11:59:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:55.239 11:59:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:55.239 11:59:07 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.239 11:59:07 -- setup/common.sh@32 -- # continue 00:04:55.239 11:59:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:55.239 11:59:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:55.239 11:59:07 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.239 11:59:07 -- setup/common.sh@32 -- # continue 00:04:55.240 11:59:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:55.240 11:59:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:55.240 11:59:07 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.240 11:59:07 -- setup/common.sh@32 -- # continue 00:04:55.240 11:59:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:55.240 11:59:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:55.240 11:59:07 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.240 
11:59:07 -- setup/common.sh@32 -- # continue 00:04:55.240 11:59:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:55.240 11:59:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:55.240 11:59:07 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.240 11:59:07 -- setup/common.sh@32 -- # continue 00:04:55.240 11:59:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:55.240 11:59:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:55.240 11:59:07 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.240 11:59:07 -- setup/common.sh@32 -- # continue 00:04:55.240 11:59:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:55.240 11:59:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:55.240 11:59:07 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.240 11:59:07 -- setup/common.sh@32 -- # continue 00:04:55.240 11:59:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:55.240 11:59:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:55.240 11:59:07 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.240 11:59:07 -- setup/common.sh@32 -- # continue 00:04:55.240 11:59:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:55.240 11:59:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:55.240 11:59:07 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.240 11:59:07 -- setup/common.sh@32 -- # continue 00:04:55.240 11:59:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:55.240 11:59:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:55.240 11:59:07 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.240 11:59:07 -- setup/common.sh@32 -- # continue 00:04:55.240 11:59:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:55.240 11:59:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:55.240 11:59:07 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.240 11:59:07 -- setup/common.sh@32 -- # continue 00:04:55.240 11:59:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:55.240 11:59:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:55.240 11:59:07 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.240 11:59:07 -- setup/common.sh@32 -- # continue 00:04:55.240 11:59:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:55.240 11:59:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:55.240 11:59:07 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.240 11:59:07 -- setup/common.sh@32 -- # continue 00:04:55.240 11:59:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:55.240 11:59:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:55.240 11:59:07 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.240 11:59:07 -- setup/common.sh@32 -- # continue 00:04:55.240 11:59:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:55.240 11:59:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:55.240 11:59:07 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.240 11:59:07 -- setup/common.sh@32 -- # continue 00:04:55.240 11:59:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:55.240 11:59:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:55.240 11:59:07 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.240 11:59:07 -- setup/common.sh@32 -- # continue 00:04:55.240 11:59:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:55.240 11:59:07 -- setup/common.sh@31 -- # read 
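The trace above is get_meminfo in setup/common.sh walking /proc/meminfo one line at a time: split on IFS=': ', compare the key against the requested field, echo the value on a match. A minimal sketch of that idiom, under a hypothetical helper name (the real helper mapfiles the file into an array first; this version streams it, and the "echo 0" fallback for a missing key is an assumption):

  # Hypothetical sketch of the get_meminfo idiom traced above.
  get_meminfo_sketch() {
      local get=$1 var val _
      while IFS=': ' read -r var val _; do
          # e.g. "HugePages_Surp: 0" splits into var=HugePages_Surp, val=0
          [[ $var == "$get" ]] && { printf '%s\n' "$val"; return 0; }
      done < /proc/meminfo
      echo 0   # assumed fallback when the key is absent
  }
  # usage: get_meminfo_sketch HugePages_Surp   -> prints 0 on this box, per the snapshot above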
00:04:55.240 11:59:07 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:55.240 11:59:07 -- setup/common.sh@33 -- # echo 0
00:04:55.240 11:59:07 -- setup/common.sh@33 -- # return 0
00:04:55.240 11:59:07 -- setup/hugepages.sh@99 -- # surp=0
00:04:55.240 11:59:07 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:55.240 11:59:07 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:55.240 11:59:07 -- setup/common.sh@18 -- # local node=
00:04:55.240 11:59:07 -- setup/common.sh@19 -- # local var val
00:04:55.240 11:59:07 -- setup/common.sh@20 -- # local mem_f mem
00:04:55.240 11:59:07 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:55.240 11:59:07 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:55.240 11:59:07 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:55.240 11:59:07 -- setup/common.sh@28 -- # mapfile -t mem
00:04:55.240 11:59:07 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:55.240 11:59:07 -- setup/common.sh@31 -- # IFS=': '
00:04:55.240 11:59:07 -- setup/common.sh@31 -- # read -r var val _
00:04:55.240 11:59:07 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293512 kB' 'MemFree: 69069976 kB' 'MemAvailable: 72974064 kB' 'Buffers: 8112 kB' 'Cached: 17654476 kB' 'SwapCached: 0 kB' 'Active: 14552124 kB' 'Inactive: 3720140 kB' 'Active(anon): 14004904 kB' 'Inactive(anon): 0 kB' 'Active(file): 547220 kB' 'Inactive(file): 3720140 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 613076 kB' 'Mapped: 188664 kB' 'Shmem: 13395228 kB' 'KReclaimable: 472696 kB' 'Slab: 881700 kB' 'SReclaimable: 472696 kB' 'SUnreclaim: 409004 kB' 'KernelStack: 16336 kB' 'PageTables: 8820 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486784 kB' 'Committed_AS: 15471176 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214248 kB' 'VmallocChunk: 0 kB' 'Percpu: 69120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1592732 kB' 'DirectMap2M: 41074688 kB' 'DirectMap1G: 58720256 kB'
00:04:55.240 11:59:07 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:55.240 11:59:07 -- setup/common.sh@32 -- # continue
00:04:55.241 11:59:07 -- setup/common.sh@32 -- # ... (each remaining non-matching key, MemFree through HugePages_Free, compared and skipped the same way) ...
00:04:55.241 11:59:07 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:55.241 11:59:07 -- setup/common.sh@33 -- # echo 0
00:04:55.241 11:59:07 -- setup/common.sh@33 -- # return 0
00:04:55.241 11:59:07 -- setup/hugepages.sh@100 -- # resv=0
00:04:55.241 11:59:07 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:55.241 nr_hugepages=1024
00:04:55.241 11:59:07 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:55.241 resv_hugepages=0
00:04:55.241 11:59:07 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:55.241 surplus_hugepages=0
00:04:55.241 11:59:07 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:55.241 anon_hugepages=0
00:04:55.241 11:59:07 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:55.241 11:59:07 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:04:55.241 11:59:07 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:55.241 11:59:07 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:55.241 11:59:07 -- setup/common.sh@18 -- # local node=
00:04:55.241 11:59:07 -- setup/common.sh@19 -- # local var val
00:04:55.241 11:59:07 -- setup/common.sh@20 -- # local mem_f mem
00:04:55.241 11:59:07 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:55.241 11:59:07 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:55.241 11:59:07 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:55.241 11:59:07 -- setup/common.sh@28 -- # mapfile -t mem
00:04:55.241 11:59:07 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:55.241 11:59:07 -- setup/common.sh@31 -- # IFS=': '
00:04:55.241 11:59:07 -- setup/common.sh@31 -- # read -r var val _
00:04:55.242 11:59:07 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293512 kB' 'MemFree: 69068748 kB' 'MemAvailable: 72972836 kB' 'Buffers: 8112 kB' 'Cached: 17654492 kB' 'SwapCached: 0 kB' 'Active: 14552132 kB' 'Inactive: 3720140 kB' 'Active(anon): 14004912 kB' 'Inactive(anon): 0 kB' 'Active(file): 547220 kB' 'Inactive(file): 3720140 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 613012 kB' 'Mapped: 188664 kB' 'Shmem: 13395244 kB' 'KReclaimable: 472696 kB' 'Slab: 881692 kB' 'SReclaimable: 472696 kB' 'SUnreclaim: 408996 kB' 'KernelStack: 16464 kB' 'PageTables: 8820 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486784 kB' 'Committed_AS: 15471316 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214264 kB' 'VmallocChunk: 0 kB' 'Percpu: 69120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1592732 kB' 'DirectMap2M: 41074688 kB' 'DirectMap1G: 58720256 kB'
00:04:55.242 11:59:07 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:55.242 11:59:07 -- setup/common.sh@32 -- # continue
00:04:55.243 11:59:07 -- setup/common.sh@32 -- # ... (each remaining non-matching key, MemFree through Unaccepted, compared and skipped the same way) ...
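At this point hugepages.sh has collected anon, surp and resv (all 0 here) and checks that the kernel's HugePages_Total matches what the test requested plus surplus and reserved pages. A sketch of that consistency check, reusing the hypothetical get_meminfo_sketch helper from above with the values this trace echoes:

  # Values as echoed in the trace above.
  nr_hugepages=1024
  surp=0 resv=0 anon=0
  total=$(get_meminfo_sketch HugePages_Total)   # 1024 on this box
  # Mirrors hugepages.sh@107/@110: total must equal requested + surplus + reserved.
  (( total == nr_hugepages + surp + resv )) || echo "unexpected hugepage count: $total"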
00:04:55.243 11:59:07 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:55.243 11:59:07 -- setup/common.sh@33 -- # echo 1024
00:04:55.243 11:59:07 -- setup/common.sh@33 -- # return 0
00:04:55.243 11:59:07 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:55.243 11:59:07 -- setup/hugepages.sh@112 -- # get_nodes
00:04:55.243 11:59:07 -- setup/hugepages.sh@27 -- # local node
00:04:55.243 11:59:07 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:55.243 11:59:07 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:55.243 11:59:07 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:55.243 11:59:07 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:55.243 11:59:07 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:55.243 11:59:07 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:55.243 11:59:07 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:55.243 11:59:07 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:55.243 11:59:07 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:55.243 11:59:07 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:55.243 11:59:07 -- setup/common.sh@18 -- # local node=0
00:04:55.243 11:59:07 -- setup/common.sh@19 -- # local var val
00:04:55.243 11:59:07 -- setup/common.sh@20 -- # local mem_f mem
00:04:55.243 11:59:07 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:55.243 11:59:07 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:55.243 11:59:07 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:55.243 11:59:07 -- setup/common.sh@28 -- # mapfile -t mem
00:04:55.243 11:59:07 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:55.243 11:59:07 -- setup/common.sh@31 -- # IFS=': '
00:04:55.243 11:59:07 -- setup/common.sh@31 -- # read -r var val _
00:04:55.243 11:59:07 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069876 kB' 'MemFree: 29818904 kB' 'MemUsed: 18250972 kB' 'SwapCached: 0 kB' 'Active: 11775668 kB' 'Inactive: 3587012 kB' 'Active(anon): 11514160 kB' 'Inactive(anon): 0 kB' 'Active(file): 261508 kB' 'Inactive(file): 3587012 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 14896820 kB' 'Mapped: 111344 kB' 'AnonPages: 469016 kB' 'Shmem: 11048300 kB' 'KernelStack: 9064 kB' 'PageTables: 5280 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 288080 kB' 'Slab: 515240 kB' 'SReclaimable: 288080 kB' 'SUnreclaim: 227160 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:04:55.243 11:59:07 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:55.243 11:59:07 -- setup/common.sh@32 -- # continue
00:04:55.244 11:59:07 -- setup/common.sh@32 -- # ... (each remaining non-matching node0 key, MemFree through HugePages_Total, compared and skipped the same way) ...
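For the per-node pass, get_meminfo is re-invoked with a node argument, which switches the source file to /sys/devices/system/node/node0/meminfo; lines there carry a "Node 0 " prefix, which the traced `mem=("${mem[@]#Node +([0-9]) }")` expansion strips (that pattern needs extglob). A sketch of that variant under the same hypothetical naming:

  # Hypothetical per-node variant of the helper, assuming the prefix-stripping seen in the trace.
  shopt -s extglob
  get_node_meminfo_sketch() {
      local node=$1 get=$2 mem line var val _
      mapfile -t mem < "/sys/devices/system/node/node${node}/meminfo"
      mem=("${mem[@]#Node +([0-9]) }")   # drop the leading "Node N " from every line
      for line in "${mem[@]}"; do
          IFS=': ' read -r var val _ <<< "$line"
          [[ $var == "$get" ]] && { printf '%s\n' "$val"; return 0; }
      done
      echo 0
  }
  # usage: get_node_meminfo_sketch 0 HugePages_Total   -> 512 on this box, per the node0 snapshot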
00:04:55.244 11:59:07 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.244 11:59:07 -- setup/common.sh@32 -- # continue 00:04:55.244 11:59:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:55.244 11:59:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:55.244 11:59:07 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.244 11:59:07 -- setup/common.sh@33 -- # echo 0 00:04:55.244 11:59:07 -- setup/common.sh@33 -- # return 0 00:04:55.244 11:59:07 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:55.244 11:59:07 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:55.244 11:59:07 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:55.244 11:59:07 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:55.244 11:59:07 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:55.244 11:59:07 -- setup/common.sh@18 -- # local node=1 00:04:55.244 11:59:07 -- setup/common.sh@19 -- # local var val 00:04:55.244 11:59:07 -- setup/common.sh@20 -- # local mem_f mem 00:04:55.244 11:59:07 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:55.244 11:59:07 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:55.244 11:59:07 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:55.244 11:59:07 -- setup/common.sh@28 -- # mapfile -t mem 00:04:55.244 11:59:07 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:55.244 11:59:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:55.244 11:59:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:55.244 11:59:07 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44223636 kB' 'MemFree: 39248776 kB' 'MemUsed: 4974860 kB' 'SwapCached: 0 kB' 'Active: 2776912 kB' 'Inactive: 133128 kB' 'Active(anon): 2491200 kB' 'Inactive(anon): 0 kB' 'Active(file): 285712 kB' 'Inactive(file): 133128 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2765800 kB' 'Mapped: 77320 kB' 'AnonPages: 144460 kB' 'Shmem: 2346960 kB' 'KernelStack: 7432 kB' 'PageTables: 4120 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 184616 kB' 'Slab: 366452 kB' 'SReclaimable: 184616 kB' 'SUnreclaim: 181836 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:55.244 11:59:07 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.244 11:59:07 -- setup/common.sh@32 -- # continue 00:04:55.244 11:59:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:55.244 11:59:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:55.244 11:59:07 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.244 11:59:07 -- setup/common.sh@32 -- # continue 00:04:55.244 11:59:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:55.244 11:59:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:55.244 11:59:07 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.244 11:59:07 -- setup/common.sh@32 -- # continue 00:04:55.244 11:59:07 -- setup/common.sh@31 -- # IFS=': ' 00:04:55.244 11:59:07 -- setup/common.sh@31 -- # read -r var val _ 00:04:55.244 11:59:07 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.244 11:59:07 -- setup/common.sh@32 -- # continue 00:04:55.244 11:59:07 -- setup/common.sh@31 -- # IFS=': ' 
00:04:55.244 11:59:07 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:55.244 11:59:07 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:55.244 11:59:07 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:04:55.244 11:59:07 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:55.244 11:59:07 -- setup/common.sh@18 -- # local node=1
00:04:55.244 11:59:07 -- setup/common.sh@19 -- # local var val
00:04:55.244 11:59:07 -- setup/common.sh@20 -- # local mem_f mem
00:04:55.244 11:59:07 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:55.244 11:59:07 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:04:55.244 11:59:07 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:04:55.244 11:59:07 -- setup/common.sh@28 -- # mapfile -t mem
00:04:55.244 11:59:07 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:55.244 11:59:07 -- setup/common.sh@31 -- # IFS=': '
00:04:55.244 11:59:07 -- setup/common.sh@31 -- # read -r var val _
00:04:55.244 11:59:07 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44223636 kB' 'MemFree: 39248776 kB' 'MemUsed: 4974860 kB' 'SwapCached: 0 kB' 'Active: 2776912 kB' 'Inactive: 133128 kB' 'Active(anon): 2491200 kB' 'Inactive(anon): 0 kB' 'Active(file): 285712 kB' 'Inactive(file): 133128 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2765800 kB' 'Mapped: 77320 kB' 'AnonPages: 144460 kB' 'Shmem: 2346960 kB' 'KernelStack: 7432 kB' 'PageTables: 4120 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 184616 kB' 'Slab: 366452 kB' 'SReclaimable: 184616 kB' 'SUnreclaim: 181836 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:04:55.245 11:59:07 -- setup/common.sh@32 -- # continue (per-field scan, MemTotal through HugePages_Free: no match for HugePages_Surp)
00:04:55.245 11:59:07 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:55.245 11:59:07 -- setup/common.sh@33 -- # echo 0
00:04:55.245 11:59:07 -- setup/common.sh@33 -- # return 0
00:04:55.245 11:59:07 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:55.245 11:59:07 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:55.245 11:59:07 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:55.245 11:59:07 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:55.245 11:59:07 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:04:55.245 node0=512 expecting 512
00:04:55.245 11:59:07 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:55.245 11:59:07 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:55.245 11:59:07 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:55.245 11:59:07 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512'
00:04:55.245 node1=512 expecting 512
00:04:55.245 11:59:07 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]]
00:04:55.245
00:04:55.245 real 0m6.026s
00:04:55.245 user 0m2.015s
00:04:55.245 sys 0m4.047s
00:04:55.245 11:59:07 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:55.245 11:59:07 -- common/autotest_common.sh@10 -- # set +x
00:04:55.245 ************************************
00:04:55.245 END TEST per_node_1G_alloc
00:04:55.245 ************************************
00:04:55.245 11:59:07 -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc
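Before the next test starts, the node0=512/node1=512 verdict above is worth unpacking: per_node_1G_alloc asked for 1024 huge pages split evenly across the two NUMA nodes, and the hugepages.sh@115-128 loop folds each node's reserved and surplus pages into its expected count before printing the comparison. A loose Bash paraphrase, assuming get_meminfo behaves as sketched earlier; the origin of resv is not visible in this excerpt and is assumed here:

# Paraphrase of the per-node bookkeeping traced above; with zero reserved
# and surplus pages, each node keeps its even share of the 1024-page pool.
declare -a nodes_test=(512 512)
resv=$(get_meminfo HugePages_Rsvd)               # 0 in this run (assumed source)
for node in "${!nodes_test[@]}"; do
    (( nodes_test[node] += resv ))
    surp=$(get_meminfo HugePages_Surp "$node")   # 0 on both nodes here
    (( nodes_test[node] += surp ))
    echo "node$node=${nodes_test[node]} expecting 512"
done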
11:59:07 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:55.245 11:59:07 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:55.245 11:59:07 -- common/autotest_common.sh@10 -- # set +x 00:04:55.245 ************************************ 00:04:55.245 START TEST even_2G_alloc 00:04:55.245 ************************************ 00:04:55.245 11:59:07 -- common/autotest_common.sh@1104 -- # even_2G_alloc 00:04:55.245 11:59:07 -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152 00:04:55.245 11:59:07 -- setup/hugepages.sh@49 -- # local size=2097152 00:04:55.245 11:59:07 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:55.245 11:59:07 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:55.245 11:59:07 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:55.245 11:59:07 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:55.245 11:59:07 -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:55.245 11:59:07 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:55.245 11:59:07 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:55.245 11:59:07 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:55.245 11:59:07 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:55.245 11:59:07 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:55.245 11:59:07 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:55.245 11:59:07 -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:55.245 11:59:07 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:55.245 11:59:07 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:55.245 11:59:07 -- setup/hugepages.sh@83 -- # : 512 00:04:55.245 11:59:07 -- setup/hugepages.sh@84 -- # : 1 00:04:55.245 11:59:07 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:55.245 11:59:07 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:55.245 11:59:07 -- setup/hugepages.sh@83 -- # : 0 00:04:55.245 11:59:07 -- setup/hugepages.sh@84 -- # : 0 00:04:55.245 11:59:07 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:55.245 11:59:08 -- setup/hugepages.sh@153 -- # NRHUGE=1024 00:04:55.245 11:59:08 -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes 00:04:55.245 11:59:08 -- setup/hugepages.sh@153 -- # setup output 00:04:55.245 11:59:08 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:55.246 11:59:08 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:58.537 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:58.537 0000:1a:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:58.537 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:58.537 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:58.537 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:58.537 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:58.537 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:58.537 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:58.537 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:58.796 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:58.796 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:58.796 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:58.796 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:58.796 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:58.796 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:58.796 0000:80:04.1 (8086 2021): 
00:04:58.796 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:58.796 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:05:00.702 11:59:13 -- setup/hugepages.sh@154 -- # verify_nr_hugepages
00:05:00.702 11:59:13 -- setup/hugepages.sh@89 -- # local node
00:05:00.702 11:59:13 -- setup/hugepages.sh@90 -- # local sorted_t
00:05:00.702 11:59:13 -- setup/hugepages.sh@91 -- # local sorted_s
00:05:00.702 11:59:13 -- setup/hugepages.sh@92 -- # local surp
00:05:00.702 11:59:13 -- setup/hugepages.sh@93 -- # local resv
00:05:00.702 11:59:13 -- setup/hugepages.sh@94 -- # local anon
00:05:00.702 11:59:13 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:05:00.702 11:59:13 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:05:00.702 11:59:13 -- setup/common.sh@17 -- # local get=AnonHugePages
00:05:00.702 11:59:13 -- setup/common.sh@18 -- # local node=
00:05:00.702 11:59:13 -- setup/common.sh@19 -- # local var val
00:05:00.702 11:59:13 -- setup/common.sh@20 -- # local mem_f mem
00:05:00.702 11:59:13 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:00.702 11:59:13 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:00.702 11:59:13 -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:00.702 11:59:13 -- setup/common.sh@28 -- # mapfile -t mem
00:05:00.702 11:59:13 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:00.702 11:59:13 -- setup/common.sh@31 -- # IFS=': '
00:05:00.702 11:59:13 -- setup/common.sh@31 -- # read -r var val _
00:05:00.702 11:59:13 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293512 kB' 'MemFree: 69089776 kB' 'MemAvailable: 72993944 kB' 'Buffers: 8112 kB' 'Cached: 17654620 kB' 'SwapCached: 0 kB' 'Active: 14554360 kB' 'Inactive: 3720140 kB' 'Active(anon): 14007140 kB' 'Inactive(anon): 0 kB' 'Active(file): 547220 kB' 'Inactive(file): 3720140 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 615028 kB' 'Mapped: 188732 kB' 'Shmem: 13395372 kB' 'KReclaimable: 472776 kB' 'Slab: 881424 kB' 'SReclaimable: 472776 kB' 'SUnreclaim: 408648 kB' 'KernelStack: 16480 kB' 'PageTables: 9064 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486784 kB' 'Committed_AS: 15472272 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214392 kB' 'VmallocChunk: 0 kB' 'Percpu: 69120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1592732 kB' 'DirectMap2M: 41074688 kB' 'DirectMap1G: 58720256 kB'
00:05:00.703 11:59:13 -- setup/common.sh@32 -- # continue (per-field scan, MemTotal through HardwareCorrupted: no match for AnonHugePages)
00:05:00.703 11:59:13 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:00.703 11:59:13 -- setup/common.sh@33 -- # echo 0
00:05:00.703 11:59:13 -- setup/common.sh@33 -- # return 0
00:05:00.703 11:59:13 -- setup/hugepages.sh@97 -- # anon=0
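The anon=0 just computed is gated on transparent hugepages: the [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] test at hugepages.sh@96 only bothers reading AnonHugePages while THP is not globally "[never]". A hedged sketch of that step; the sysfs path is the standard kernel one, and the fold into the accounting is this run's trivial zero case:

# Sketch of the hugepages.sh@96-97 step traced above.
anon=0
thp=$(</sys/kernel/mm/transparent_hugepage/enabled)   # e.g. "always [madvise] never"
if [[ $thp != *"[never]"* ]]; then
    anon=$(get_meminfo AnonHugePages)   # kB of THP-backed anonymous memory; 0 here
fi
echo "anon_hugepages=$anon"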
00:05:00.703 11:59:13 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:05:00.704 11:59:13 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:00.704 11:59:13 -- setup/common.sh@18 -- # local node=
00:05:00.704 11:59:13 -- setup/common.sh@19 -- # local var val
00:05:00.704 11:59:13 -- setup/common.sh@20 -- # local mem_f mem
00:05:00.704 11:59:13 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:00.704 11:59:13 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:00.704 11:59:13 -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:00.704 11:59:13 -- setup/common.sh@28 -- # mapfile -t mem
00:05:00.704 11:59:13 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:00.704 11:59:13 -- setup/common.sh@31 -- # IFS=': '
00:05:00.704 11:59:13 -- setup/common.sh@31 -- # read -r var val _
00:05:00.704 11:59:13 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293512 kB' 'MemFree: 69087624 kB' 'MemAvailable: 72991792 kB' 'Buffers: 8112 kB' 'Cached: 17654624 kB' 'SwapCached: 0 kB' 'Active: 14554724 kB' 'Inactive: 3720140 kB' 'Active(anon): 14007504 kB' 'Inactive(anon): 0 kB' 'Active(file): 547220 kB' 'Inactive(file): 3720140 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 615484 kB' 'Mapped: 188676 kB' 'Shmem: 13395376 kB' 'KReclaimable: 472776 kB' 'Slab: 881496 kB' 'SReclaimable: 472776 kB' 'SUnreclaim: 408720 kB' 'KernelStack: 16512 kB' 'PageTables: 9424 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486784 kB' 'Committed_AS: 15472284 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214360 kB' 'VmallocChunk: 0 kB' 'Percpu: 69120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1592732 kB' 'DirectMap2M: 41074688 kB' 'DirectMap1G: 58720256 kB'
00:05:00.704 11:59:13 -- setup/common.sh@32 -- # continue (per-field scan, MemTotal through HugePages_Rsvd: no match for HugePages_Surp)
00:05:00.705 11:59:13 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:00.705 11:59:13 -- setup/common.sh@33 -- # echo 0
00:05:00.705 11:59:13 -- setup/common.sh@33 -- # return 0
00:05:00.705 11:59:13 -- setup/hugepages.sh@99 -- # surp=0
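The surp=0 above and the HugePages_Rsvd pass that follows read the two counters that can make the pool differ from what was configured: surplus pages are allocated on demand beyond the static pool, reserved pages are promised to mappings but not yet faulted in. The consistency check traced further below (hugepages.sh@107-109) has to account for both; a hedged restatement, with the provenance of each variable assumed:

# Restatement of the pool checks traced below; every counter is read via
# the get_meminfo sketch and comes out 0 or 1024 in this run.
expected=1024
surp=$(get_meminfo HugePages_Surp)    # pages allocated beyond the static pool
resv=$(get_meminfo HugePages_Rsvd)    # pages reserved for mappings, not yet faulted
nr_hugepages=$(get_meminfo HugePages_Total)
(( expected == nr_hugepages + surp + resv )) || echo "surplus/reserved do not add up"
(( expected == nr_hugepages )) || echo "pool size drifted from the request"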
00:05:00.705 11:59:13 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:05:00.705 11:59:13 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:05:00.705 11:59:13 -- setup/common.sh@18 -- # local node=
00:05:00.705 11:59:13 -- setup/common.sh@19 -- # local var val
00:05:00.705 11:59:13 -- setup/common.sh@20 -- # local mem_f mem
00:05:00.705 11:59:13 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:00.705 11:59:13 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:00.705 11:59:13 -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:00.705 11:59:13 -- setup/common.sh@28 -- # mapfile -t mem
00:05:00.705 11:59:13 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:00.705 11:59:13 -- setup/common.sh@31 -- # IFS=': '
00:05:00.705 11:59:13 -- setup/common.sh@31 -- # read -r var val _
00:05:00.705 11:59:13 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293512 kB' 'MemFree: 69091704 kB' 'MemAvailable: 72995872 kB' 'Buffers: 8112 kB' 'Cached: 17654632 kB' 'SwapCached: 0 kB' 'Active: 14553820 kB' 'Inactive: 3720140 kB' 'Active(anon): 14006600 kB' 'Inactive(anon): 0 kB' 'Active(file): 547220 kB' 'Inactive(file): 3720140 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 614520 kB' 'Mapped: 188676 kB' 'Shmem: 13395384 kB' 'KReclaimable: 472776 kB' 'Slab: 881368 kB' 'SReclaimable: 472776 kB' 'SUnreclaim: 408592 kB' 'KernelStack: 16320 kB' 'PageTables: 8828 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486784 kB' 'Committed_AS: 15468112 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214248 kB' 'VmallocChunk: 0 kB' 'Percpu: 69120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1592732 kB' 'DirectMap2M: 41074688 kB' 'DirectMap1G: 58720256 kB'
00:05:00.706 11:59:13 -- setup/common.sh@32 -- # continue (per-field scan, MemTotal through HugePages_Free: no match for HugePages_Rsvd)
00:05:00.707 11:59:13 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:05:00.707 11:59:13 -- setup/common.sh@33 -- # echo 0
00:05:00.707 11:59:13 -- setup/common.sh@33 -- # return 0
00:05:00.707 11:59:13 -- setup/hugepages.sh@100 -- # resv=0
00:05:00.707 11:59:13 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:05:00.707 nr_hugepages=1024
00:05:00.707 11:59:13 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:05:00.707 resv_hugepages=0
00:05:00.707 11:59:13 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:05:00.707 surplus_hugepages=0
00:05:00.707 11:59:13 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:05:00.707 anon_hugepages=0
00:05:00.707 11:59:13 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:05:00.707 11:59:13 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:05:00.707 11:59:13 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:05:00.707 11:59:13 -- setup/common.sh@17 -- # local get=HugePages_Total
00:05:00.707 11:59:13 -- setup/common.sh@18 -- # local node=
00:05:00.707 11:59:13 -- setup/common.sh@19 -- # local var val
00:05:00.707 11:59:13 -- setup/common.sh@20 -- # local mem_f mem
00:05:00.707 11:59:13 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:00.707 11:59:13 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:00.707 11:59:13 -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:00.707 11:59:13 -- setup/common.sh@28 -- # mapfile -t mem
00:05:00.707 11:59:13 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:00.707 11:59:13 -- setup/common.sh@31 -- # IFS=': '
00:05:00.707 11:59:13 -- setup/common.sh@31 -- # read -r var val _
00:05:00.707 11:59:13 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293512 kB' 'MemFree: 69090948 kB' 'MemAvailable: 72995116 kB' 'Buffers: 8112 kB' 'Cached: 17654648 kB' 'SwapCached: 0 kB' 'Active: 14553628 kB' 'Inactive: 3720140 kB' 'Active(anon): 14006408 kB' 'Inactive(anon): 0 kB' 'Active(file): 547220 kB' 'Inactive(file): 3720140 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 614300 kB' 'Mapped: 188672 kB' 'Shmem: 13395400 kB' 'KReclaimable: 472776 kB' 'Slab: 881368 kB' 'SReclaimable: 472776 kB' 'SUnreclaim: 408592 kB' 'KernelStack: 16400 kB'
'PageTables: 8840 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486784 kB' 'Committed_AS: 15468124 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214264 kB' 'VmallocChunk: 0 kB' 'Percpu: 69120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1592732 kB' 'DirectMap2M: 41074688 kB' 'DirectMap1G: 58720256 kB' 00:05:00.707 11:59:13 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.707 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.707 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.707 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.707 11:59:13 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.707 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.707 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.707 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.707 11:59:13 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.707 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.707 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.707 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.707 11:59:13 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.707 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.707 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.707 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.707 11:59:13 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.707 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.707 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.707 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.707 11:59:13 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.708 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.708 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.708 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.708 11:59:13 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.708 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.708 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.708 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.708 11:59:13 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.708 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.708 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.708 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.708 11:59:13 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.708 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.708 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.708 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.708 11:59:13 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.708 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.708 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.708 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.708 11:59:13 -- setup/common.sh@32 -- # [[ Active(file) == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.708 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.708 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.708 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.708 11:59:13 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.708 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.708 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.708 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.708 11:59:13 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.708 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.708 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.708 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.708 11:59:13 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.708 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.708 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.708 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.708 11:59:13 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.708 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.708 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.708 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.708 11:59:13 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.708 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.708 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.708 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.708 11:59:13 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.708 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.708 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.708 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.708 11:59:13 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.708 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.708 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.708 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.708 11:59:13 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.708 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.708 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.708 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.708 11:59:13 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.708 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.708 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.708 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.708 11:59:13 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.708 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.708 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.708 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.708 11:59:13 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.708 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.708 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.708 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.708 11:59:13 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.708 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.708 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.708 11:59:13 -- 
setup/common.sh@31 -- # read -r var val _ 00:05:00.708 11:59:13 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.708 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.708 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.708 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.708 11:59:13 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.708 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.708 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.708 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.709 11:59:13 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.709 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.709 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.709 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.709 11:59:13 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.709 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.709 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.709 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.709 11:59:13 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.709 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.709 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.709 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.709 11:59:13 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.709 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.709 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.709 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.709 11:59:13 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.709 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.709 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.709 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.709 11:59:13 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.709 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.709 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.709 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.709 11:59:13 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.709 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.709 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.709 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.709 11:59:13 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.709 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.709 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.709 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.709 11:59:13 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.709 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.709 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.709 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.709 11:59:13 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.709 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.709 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.709 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.709 11:59:13 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 
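
The long runs of "[[ <key> == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]" / "continue" pairs in this trace are bash xtrace (set -x) output from get_meminfo in setup/common.sh: the function slurps /proc/meminfo (or a per-node copy) with mapfile, strips any "Node <n> " prefix, splits each line on IFS=': ', and skips every key until it reaches the one requested, whose value it echoes. The backslash-escaped right-hand side is simply how xtrace prints a quoted, literal (non-glob) comparison word. A minimal sketch of that logic follows; the helper name get_meminfo_sketch is hypothetical and this is not the actual setup/common.sh source:

  # get_meminfo_sketch: hypothetical stand-in for setup/common.sh's get_meminfo.
  # Prints the value of one meminfo key, optionally scoped to a NUMA node's
  # copy under /sys/devices/system/node/node<N>/meminfo.
  get_meminfo_sketch() {
      local get=$1 node=${2:-} mem_f=/proc/meminfo
      [[ -n $node ]] && mem_f=/sys/devices/system/node/node$node/meminfo
      shopt -s extglob                     # needed for the +([0-9]) pattern
      local -a mem
      mapfile -t mem < "$mem_f"
      mem=("${mem[@]#Node +([0-9]) }")     # per-node files prefix lines with "Node <n> "
      local line var val _
      for line in "${mem[@]}"; do
          IFS=': ' read -r var val _ <<< "$line"
          [[ $var == "$get" ]] && { echo "$val"; return 0; }
      done
      return 1
  }
  # usage: get_meminfo_sketch HugePages_Total     -> 1024 on this runner
  #        get_meminfo_sketch HugePages_Surp 0    -> surplus pages on node 0
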
00:05:00.709 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.709 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.709 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.709 11:59:13 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.709 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.709 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.709 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.709 11:59:13 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.709 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.709 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.709 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.709 11:59:13 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.709 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.709 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.709 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.709 11:59:13 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.709 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.709 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.709 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.709 11:59:13 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.709 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.709 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.709 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.709 11:59:13 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.709 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.709 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.709 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.709 11:59:13 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.709 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.709 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.709 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.709 11:59:13 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.709 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.709 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.709 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.709 11:59:13 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.709 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.709 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.709 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.709 11:59:13 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.709 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.709 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.709 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.709 11:59:13 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.709 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.709 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.709 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.709 11:59:13 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.709 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.709 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.709 11:59:13 -- 
setup/common.sh@31 -- # read -r var val _ 00:05:00.709 11:59:13 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.709 11:59:13 -- setup/common.sh@33 -- # echo 1024 00:05:00.709 11:59:13 -- setup/common.sh@33 -- # return 0 00:05:00.709 11:59:13 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:00.709 11:59:13 -- setup/hugepages.sh@112 -- # get_nodes 00:05:00.709 11:59:13 -- setup/hugepages.sh@27 -- # local node 00:05:00.709 11:59:13 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:00.709 11:59:13 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:05:00.709 11:59:13 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:00.709 11:59:13 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:05:00.709 11:59:13 -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:00.709 11:59:13 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:00.710 11:59:13 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:00.710 11:59:13 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:00.710 11:59:13 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:05:00.710 11:59:13 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:00.710 11:59:13 -- setup/common.sh@18 -- # local node=0 00:05:00.710 11:59:13 -- setup/common.sh@19 -- # local var val 00:05:00.710 11:59:13 -- setup/common.sh@20 -- # local mem_f mem 00:05:00.710 11:59:13 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:00.710 11:59:13 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:00.710 11:59:13 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:00.710 11:59:13 -- setup/common.sh@28 -- # mapfile -t mem 00:05:00.710 11:59:13 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:00.710 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.710 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.710 11:59:13 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069876 kB' 'MemFree: 29826040 kB' 'MemUsed: 18243836 kB' 'SwapCached: 0 kB' 'Active: 11775284 kB' 'Inactive: 3587012 kB' 'Active(anon): 11513776 kB' 'Inactive(anon): 0 kB' 'Active(file): 261508 kB' 'Inactive(file): 3587012 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 14896940 kB' 'Mapped: 111280 kB' 'AnonPages: 468560 kB' 'Shmem: 11048420 kB' 'KernelStack: 8984 kB' 'PageTables: 5036 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 288080 kB' 'Slab: 514992 kB' 'SReclaimable: 288080 kB' 'SUnreclaim: 226912 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:00.710 11:59:13 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.710 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.710 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.710 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.710 11:59:13 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.710 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.710 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.710 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.710 11:59:13 -- setup/common.sh@32 -- # [[ MemUsed == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.710 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.710 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.710 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.710 11:59:13 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.710 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.710 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.710 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.710 11:59:13 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.710 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.710 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.710 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.710 11:59:13 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.710 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.710 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.710 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.710 11:59:13 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.710 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.710 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.710 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.710 11:59:13 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.710 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.710 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.710 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.710 11:59:13 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.710 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.710 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.710 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.710 11:59:13 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.710 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.710 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.710 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.710 11:59:13 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.710 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.710 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.710 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.710 11:59:13 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.710 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.710 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.710 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.710 11:59:13 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.710 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.710 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.710 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.710 11:59:13 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.710 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.710 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.710 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.710 11:59:13 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.710 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.710 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.710 11:59:13 -- 
setup/common.sh@31 -- # read -r var val _ 00:05:00.710 11:59:13 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.710 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.710 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.710 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.710 11:59:13 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.710 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.710 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.710 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.710 11:59:13 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.710 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.710 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.710 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.710 11:59:13 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.710 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.710 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.710 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.710 11:59:13 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.710 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.711 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.711 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.971 11:59:13 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.971 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.971 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.971 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.971 11:59:13 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.971 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.971 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.971 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.971 11:59:13 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.971 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.971 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.971 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.971 11:59:13 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.971 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.971 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.971 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.971 11:59:13 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.971 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.971 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.971 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.971 11:59:13 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.971 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.971 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.971 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.971 11:59:13 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.971 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.971 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.971 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.971 11:59:13 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.971 11:59:13 -- setup/common.sh@32 -- # 
continue 00:05:00.971 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.971 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.971 11:59:13 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.971 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.971 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.971 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.971 11:59:13 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.971 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.971 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.971 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.971 11:59:13 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.971 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.971 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.971 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.971 11:59:13 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.971 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.971 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.971 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.971 11:59:13 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.971 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.971 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.971 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.971 11:59:13 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.971 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.971 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.971 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.971 11:59:13 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.971 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.971 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.971 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.971 11:59:13 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.971 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.971 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.971 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.971 11:59:13 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.971 11:59:13 -- setup/common.sh@33 -- # echo 0 00:05:00.971 11:59:13 -- setup/common.sh@33 -- # return 0 00:05:00.971 11:59:13 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:00.971 11:59:13 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:00.971 11:59:13 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:00.971 11:59:13 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:05:00.971 11:59:13 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:00.971 11:59:13 -- setup/common.sh@18 -- # local node=1 00:05:00.971 11:59:13 -- setup/common.sh@19 -- # local var val 00:05:00.971 11:59:13 -- setup/common.sh@20 -- # local mem_f mem 00:05:00.971 11:59:13 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:00.971 11:59:13 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:05:00.971 11:59:13 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:05:00.971 11:59:13 -- setup/common.sh@28 -- # mapfile -t mem 
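
node0 has just been queried above (HugePages_Surp came back 0 via the "echo 0 / return 0" pair) and the same scan now repeats against /sys/devices/system/node/node1/meminfo. verify_nr_hugepages folds the reserved and surplus counts into each node's expected share (both are 0 in this run), which is why the test prints "node0=512 expecting 512" and "node1=512 expecting 512" below. A hedged sketch of that per-node accounting, reusing the hypothetical helper sketched earlier:

  resv=0                              # HugePages_Rsvd, reported as 0 above
  nodes_test=(512 512)                # per-node targets for the 1024-page pool
  for node in 0 1; do
      (( nodes_test[node] += resv ))
      surp=$(get_meminfo_sketch HugePages_Surp "$node")
      (( nodes_test[node] += surp ))  # also 0 in this run
      echo "node${node}=${nodes_test[node]} expecting 512"
  done
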
00:05:00.972 11:59:13 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.972 11:59:13 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44223636 kB' 'MemFree: 39265472 kB' 'MemUsed: 4958164 kB' 'SwapCached: 0 kB' 'Active: 2777996 kB' 'Inactive: 133128 kB' 'Active(anon): 2492284 kB' 'Inactive(anon): 0 kB' 'Active(file): 285712 kB' 'Inactive(file): 133128 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2765848 kB' 'Mapped: 77392 kB' 'AnonPages: 145352 kB' 'Shmem: 2347008 kB' 'KernelStack: 7400 kB' 'PageTables: 3756 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 184696 kB' 'Slab: 366376 kB' 'SReclaimable: 184696 kB' 'SUnreclaim: 181680 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.972 11:59:13 -- 
setup/common.sh@31 -- # read -r var val _ 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # 
continue 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.972 11:59:13 -- 
setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # continue 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # IFS=': ' 00:05:00.972 11:59:13 -- setup/common.sh@31 -- # read -r var val _ 00:05:00.972 11:59:13 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.972 11:59:13 -- setup/common.sh@33 -- # echo 0 00:05:00.972 11:59:13 -- setup/common.sh@33 -- # return 0 00:05:00.972 11:59:13 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:00.972 11:59:13 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:00.973 11:59:13 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:00.973 11:59:13 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:00.973 11:59:13 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:05:00.973 node0=512 expecting 512 00:05:00.973 11:59:13 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:00.973 11:59:13 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:00.973 11:59:13 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:00.973 11:59:13 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:05:00.973 node1=512 expecting 512 00:05:00.973 11:59:13 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:05:00.973 00:05:00.973 real 0m5.777s 00:05:00.973 user 0m1.870s 00:05:00.973 sys 0m3.839s 00:05:00.973 11:59:13 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:00.973 11:59:13 -- common/autotest_common.sh@10 -- # set +x 00:05:00.973 ************************************ 00:05:00.973 END TEST even_2G_alloc 00:05:00.973 ************************************ 00:05:00.973 11:59:13 -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc 00:05:00.973 11:59:13 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:00.973 11:59:13 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:00.973 11:59:13 -- common/autotest_common.sh@10 -- # set +x 00:05:00.973 ************************************ 00:05:00.973 START TEST odd_alloc 00:05:00.973 ************************************ 00:05:00.973 11:59:13 -- common/autotest_common.sh@1104 -- # odd_alloc 00:05:00.973 11:59:13 -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176 00:05:00.973 11:59:13 -- setup/hugepages.sh@49 -- # local size=2098176 00:05:00.973 11:59:13 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:05:00.973 11:59:13 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:05:00.973 11:59:13 -- setup/hugepages.sh@57 -- # nr_hugepages=1025 00:05:00.973 11:59:13 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:05:00.973 11:59:13 -- setup/hugepages.sh@62 -- # user_nodes=() 00:05:00.973 11:59:13 -- setup/hugepages.sh@62 -- # local user_nodes 00:05:00.973 11:59:13 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025 00:05:00.973 11:59:13 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:05:00.973 11:59:13 -- setup/hugepages.sh@67 -- # nodes_test=() 00:05:00.973 11:59:13 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:05:00.973 11:59:13 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:05:00.973 11:59:13 -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:05:00.973 
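
even_2G_alloc passes above (node0=512 and node1=512, about 5.8 s wall time) and odd_alloc begins: it exports HUGEMEM=2049 (MB), which get_test_nr_hugepages receives as size 2098176 kB and turns into the deliberately odd count nr_hugepages=1025. At the 2048 kB hugepage size in use here, 2098176 kB is 1024.5 pages, so 1025 is one plausible round-up; the rounding shown below is an assumption for illustration, not the traced script body:

  # sizing check for odd_alloc (2 MiB hugepages); round-up is an assumption
  echo $(( 2049 * 1024 ))              # 2098176 kB, the size seen in the trace
  echo $(( (2098176 + 2047) / 2048 ))  # whole pages, rounded up -> 1025
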
11:59:13 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:05:00.973 11:59:13 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:05:00.973 11:59:13 -- setup/hugepages.sh@83 -- # : 513 00:05:00.973 11:59:13 -- setup/hugepages.sh@84 -- # : 1 00:05:00.973 11:59:13 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:05:00.973 11:59:13 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513 00:05:00.973 11:59:13 -- setup/hugepages.sh@83 -- # : 0 00:05:00.973 11:59:13 -- setup/hugepages.sh@84 -- # : 0 00:05:00.973 11:59:13 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:05:00.973 11:59:13 -- setup/hugepages.sh@160 -- # HUGEMEM=2049 00:05:00.973 11:59:13 -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes 00:05:00.973 11:59:13 -- setup/hugepages.sh@160 -- # setup output 00:05:00.973 11:59:13 -- setup/common.sh@9 -- # [[ output == output ]] 00:05:00.973 11:59:13 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:05.177 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:05:05.177 0000:1a:00.0 (8086 0a54): Already using the vfio-pci driver 00:05:05.177 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:05:05.177 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:05.177 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:05.177 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:05:05.177 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:05:05.177 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:05:05.177 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:05:05.177 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:05:05.177 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:05.177 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:05:05.177 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:05.177 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:05:05.177 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:05:05.177 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:05:05.177 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:05:07.085 11:59:19 -- setup/hugepages.sh@161 -- # verify_nr_hugepages 00:05:07.085 11:59:19 -- setup/hugepages.sh@89 -- # local node 00:05:07.085 11:59:19 -- setup/hugepages.sh@90 -- # local sorted_t 00:05:07.085 11:59:19 -- setup/hugepages.sh@91 -- # local sorted_s 00:05:07.085 11:59:19 -- setup/hugepages.sh@92 -- # local surp 00:05:07.085 11:59:19 -- setup/hugepages.sh@93 -- # local resv 00:05:07.085 11:59:19 -- setup/hugepages.sh@94 -- # local anon 00:05:07.085 11:59:19 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:07.085 11:59:19 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:05:07.085 11:59:19 -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:07.085 11:59:19 -- setup/common.sh@18 -- # local node= 00:05:07.085 11:59:19 -- setup/common.sh@19 -- # local var val 00:05:07.085 11:59:19 -- setup/common.sh@20 -- # local mem_f mem 00:05:07.085 11:59:19 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:07.085 11:59:19 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:07.085 11:59:19 -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:07.085 11:59:19 -- setup/common.sh@28 -- # mapfile -t mem 00:05:07.085 11:59:19 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:07.085 11:59:19 -- setup/common.sh@31 -- # 
IFS=': ' 00:05:07.085 11:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:05:07.085 11:59:19 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293512 kB' 'MemFree: 69061848 kB' 'MemAvailable: 72966016 kB' 'Buffers: 8112 kB' 'Cached: 17654772 kB' 'SwapCached: 0 kB' 'Active: 14554164 kB' 'Inactive: 3720140 kB' 'Active(anon): 14006944 kB' 'Inactive(anon): 0 kB' 'Active(file): 547220 kB' 'Inactive(file): 3720140 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 614760 kB' 'Mapped: 188780 kB' 'Shmem: 13395524 kB' 'KReclaimable: 472776 kB' 'Slab: 881076 kB' 'SReclaimable: 472776 kB' 'SUnreclaim: 408300 kB' 'KernelStack: 16304 kB' 'PageTables: 8668 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485760 kB' 'Committed_AS: 15468756 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214328 kB' 'VmallocChunk: 0 kB' 'Percpu: 69120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 1592732 kB' 'DirectMap2M: 41074688 kB' 'DirectMap1G: 58720256 kB' 00:05:07.085 11:59:19 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.085 11:59:19 -- setup/common.sh@32 -- # continue 00:05:07.085 11:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:05:07.085 11:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:05:07.085 11:59:19 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.085 11:59:19 -- setup/common.sh@32 -- # continue 00:05:07.085 11:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:05:07.085 11:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:05:07.085 11:59:19 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.085 11:59:19 -- setup/common.sh@32 -- # continue 00:05:07.085 11:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:05:07.085 11:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:05:07.085 11:59:19 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.085 11:59:19 -- setup/common.sh@32 -- # continue 00:05:07.085 11:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:05:07.085 11:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:05:07.085 11:59:19 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.085 11:59:19 -- setup/common.sh@32 -- # continue 00:05:07.085 11:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:05:07.085 11:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:05:07.085 11:59:19 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.085 11:59:19 -- setup/common.sh@32 -- # continue 00:05:07.085 11:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:05:07.085 11:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:05:07.085 11:59:19 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.085 11:59:19 -- setup/common.sh@32 -- # continue 00:05:07.085 11:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:05:07.085 11:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:05:07.085 11:59:19 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.085 11:59:19 -- setup/common.sh@32 -- # continue 00:05:07.085 11:59:19 -- setup/common.sh@31 -- # 
[scan 00:05:07.085-00:05:07.086 setup/common.sh@31/@32: IFS=': '; read -r var val _; [[ $var == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] checked for each key, Active(anon) through HardwareCorrupted in /proc/meminfo order, continue on every key]
00:05:07.086 11:59:19 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:07.086 11:59:19 -- setup/common.sh@33 -- # echo 0
00:05:07.086 11:59:19 -- setup/common.sh@33 -- # return 0
00:05:07.086 11:59:19 -- setup/hugepages.sh@97 -- # anon=0
00:05:07.086 11:59:19 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:05:07.086 11:59:19 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:07.086 11:59:19 -- setup/common.sh@18 -- # local node=
00:05:07.086 11:59:19 -- setup/common.sh@19 -- # local var val
00:05:07.086 11:59:19 -- setup/common.sh@20 -- # local mem_f mem
00:05:07.086 11:59:19 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:07.086 11:59:19 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:07.086 11:59:19 -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:07.086 11:59:19 -- setup/common.sh@28 -- # mapfile -t mem
00:05:07.086 11:59:19 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:07.086 11:59:19 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293512 kB' 'MemFree: 69065820 kB' 'MemAvailable: 72969988 kB' 'Buffers: 8112 kB' 'Cached: 17654776 kB' 'SwapCached: 0 kB' 'Active: 14554156 kB' 'Inactive: 3720140 kB' 'Active(anon): 14006936 kB' 'Inactive(anon): 0 kB' 'Active(file): 547220 kB' 'Inactive(file): 3720140 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 614812 kB' 'Mapped: 188704 kB' 'Shmem: 13395528 kB' 'KReclaimable: 472776 kB' 'Slab: 881104 kB' 'SReclaimable: 472776 kB' 'SUnreclaim: 408328 kB' 'KernelStack: 16320 kB' 'PageTables: 8776 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485760 kB' 'Committed_AS: 15479140 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214280 kB' 'VmallocChunk: 0 kB' 'Percpu: 69120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 1592732 kB' 'DirectMap2M: 41074688 kB' 'DirectMap1G: 58720256 kB'
[scan 00:05:07.086-00:05:07.087 setup/common.sh@31/@32: key scan vs \H\u\g\e\P\a\g\e\s\_\S\u\r\p, MemTotal through HugePages_Rsvd in the snapshot order above, continue on every key]
00:05:07.087 11:59:19 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:07.087 11:59:19 -- setup/common.sh@33 -- # echo 0
00:05:07.087 11:59:19 -- setup/common.sh@33 -- # return 0
00:05:07.087 11:59:19 -- setup/hugepages.sh@99 -- # surp=0
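What the trace above exercises is the generic meminfo lookup in setup/common.sh: slurp the stats file with mapfile, strip any 'Node N ' prefix, then split each line on ':' and whitespace until the requested key matches and its value column is echoed. A minimal runnable sketch of that pattern, reconstructed from the traced statements (the standalone framing and the return-1 fallback for a missing key are assumptions, not the script's exact code):

#!/usr/bin/env bash
shopt -s extglob                        # the +([0-9]) pattern below needs extglob

# get_meminfo <key> [node] -- print the value column for <key>, as traced above.
get_meminfo() {
    local get=$1 node=$2
    local var val rest mem_f
    local -a mem
    mem_f=/proc/meminfo
    # With a node argument, prefer the per-node sysfs copy of the stats.
    [[ -e /sys/devices/system/node/node$node/meminfo ]] &&
        mem_f=/sys/devices/system/node/node$node/meminfo
    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")    # sysfs lines carry a "Node N " prefix
    local line
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val rest <<< "$line"
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done
    return 1                            # assumption: key absent (never hit in this run)
}

anon=$(get_meminfo AnonHugePages)       # 0 on this run
surp=$(get_meminfo HugePages_Surp)      # 0 on this run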
00:05:07.087 11:59:19 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:05:07.087 11:59:19 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:05:07.087 11:59:19 -- setup/common.sh@18 -- # local node=
00:05:07.087 11:59:19 -- setup/common.sh@19 -- # local var val
00:05:07.087 11:59:19 -- setup/common.sh@20 -- # local mem_f mem
00:05:07.087 11:59:19 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:07.087 11:59:19 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:07.087 11:59:19 -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:07.087 11:59:19 -- setup/common.sh@28 -- # mapfile -t mem
00:05:07.087 11:59:19 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:07.087 11:59:19 -- setup/common.sh@31 -- # IFS=': '
00:05:07.087 11:59:19 -- setup/common.sh@31 -- # read -r var val _
00:05:07.088 11:59:19 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293512 kB' 'MemFree: 69067404 kB' 'MemAvailable: 72971572 kB' 'Buffers: 8112 kB' 'Cached: 17654792 kB' 'SwapCached: 0 kB' 'Active: 14553576 kB' 'Inactive: 3720140 kB' 'Active(anon): 14006356 kB' 'Inactive(anon): 0 kB' 'Active(file): 547220 kB' 'Inactive(file): 3720140 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 614152 kB' 'Mapped: 188704 kB' 'Shmem: 13395544 kB' 'KReclaimable: 472776 kB' 'Slab: 881088 kB' 'SReclaimable: 472776 kB' 'SUnreclaim: 408312 kB' 'KernelStack: 16240 kB' 'PageTables: 8444 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485760 kB' 'Committed_AS: 15468420 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214232 kB' 'VmallocChunk: 0 kB' 'Percpu: 69120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 1592732 kB' 'DirectMap2M: 41074688 kB' 'DirectMap1G: 58720256 kB'
[scan 00:05:07.088-00:05:07.089 setup/common.sh@31/@32: key scan vs \H\u\g\e\P\a\g\e\s\_\R\s\v\d, MemTotal through HugePages_Free in the snapshot order above, continue on every key]
00:05:07.089 11:59:19 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:05:07.089 11:59:19 -- setup/common.sh@33 -- # echo 0
00:05:07.089 11:59:19 -- setup/common.sh@33 -- # return 0
00:05:07.089 11:59:19 -- setup/hugepages.sh@100 -- # resv=0
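With anon, surp and resv all zero, the assertions that follow reduce to requiring that the kernel's HugePages_Total equal the requested pool size exactly. A short sketch of that consistency check, with variable names taken from the hugepages.sh trace (the error message is an assumption):

nr_hugepages=1025 surp=0 resv=0 anon=0
total=$(get_meminfo HugePages_Total)   # reads 1025 from /proc/meminfo on this run
if (( total != nr_hugepages + surp + resv )); then
    echo "hugepage pool mismatch: kernel reports $total, expected $((nr_hugepages + surp + resv))" >&2
fi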
00:05:07.089 11:59:19 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025
00:05:07.089 nr_hugepages=1025
00:05:07.089 11:59:19 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:05:07.089 resv_hugepages=0
00:05:07.089 11:59:19 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:05:07.089 surplus_hugepages=0
00:05:07.089 11:59:19 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:05:07.089 anon_hugepages=0
00:05:07.089 11:59:19 -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv ))
00:05:07.089 11:59:19 -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages ))
00:05:07.089 11:59:19 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:05:07.089 11:59:19 -- setup/common.sh@17 -- # local get=HugePages_Total
00:05:07.089 11:59:19 -- setup/common.sh@18 -- # local node=
00:05:07.089 11:59:19 -- setup/common.sh@19 -- # local var val
00:05:07.089 11:59:19 -- setup/common.sh@20 -- # local mem_f mem
00:05:07.089 11:59:19 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:07.089 11:59:19 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:07.089 11:59:19 -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:07.089 11:59:19 -- setup/common.sh@28 -- # mapfile -t mem
00:05:07.089 11:59:19 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:07.089 11:59:19 -- setup/common.sh@31 -- # IFS=': '
00:05:07.089 11:59:19 -- setup/common.sh@31 -- # read -r var val _
00:05:07.089 11:59:19 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293512 kB' 'MemFree: 69067848 kB' 'MemAvailable: 72972016 kB' 'Buffers: 8112 kB' 'Cached: 17654796 kB' 'SwapCached: 0 kB' 'Active: 14553932 kB' 'Inactive: 3720140 kB' 'Active(anon): 14006712 kB' 'Inactive(anon): 0 kB' 'Active(file): 547220 kB' 'Inactive(file): 3720140 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 614500 kB' 'Mapped: 188704 kB' 'Shmem: 13395548 kB' 'KReclaimable: 472776 kB' 'Slab: 881088 kB' 'SReclaimable: 472776 kB' 'SUnreclaim: 408312 kB' 'KernelStack: 16256 kB' 'PageTables: 8492 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485760 kB' 'Committed_AS: 15468564 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214248 kB' 'VmallocChunk: 0 kB' 'Percpu: 69120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 1592732 kB' 'DirectMap2M: 41074688 kB' 'DirectMap1G: 58720256 kB'
[scan 00:05:07.089-00:05:07.091 setup/common.sh@31/@32: key scan vs \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l, MemTotal through Unaccepted in the snapshot order above, continue on every key]
00:05:07.091 11:59:19 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:05:07.091 11:59:19 -- setup/common.sh@33 -- # echo 1025
00:05:07.091 11:59:19 -- setup/common.sh@33 -- # return 0
00:05:07.091 11:59:19 -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv ))
00:05:07.091 11:59:19 -- setup/hugepages.sh@112 -- # get_nodes
00:05:07.091 11:59:19 -- setup/hugepages.sh@27 -- # local node
00:05:07.091 11:59:19 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:07.091 11:59:19 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:05:07.091 11:59:19 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:07.091 11:59:19 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513
00:05:07.091 11:59:19 -- setup/hugepages.sh@32 -- # no_nodes=2
00:05:07.091 11:59:19 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
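get_nodes discovers the NUMA topology by globbing the sysfs node directories; this machine has two nodes, and the odd-sized 1025-page pool is recorded as a 512/513 split so the uneven case gets exercised. A sketch of the discovery step (reading the per-node 2 MiB counter to obtain the traced 512 and 513 values is an assumption about the elided command):

shopt -s extglob
declare -A nodes_sys
for node in /sys/devices/system/node/node+([0-9]); do
    # assumption: the traced 512/513 come from the node's 2048 kB pool counter
    nodes_sys[${node##*node}]=$(< "$node/hugepages/hugepages-2048kB/nr_hugepages")
done
no_nodes=${#nodes_sys[@]}              # 2 here, with nodes_sys=([0]=512 [1]=513)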
00:05:07.091 11:59:19 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:05:07.091 11:59:19 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:05:07.091 11:59:19 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:05:07.091 11:59:19 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:07.091 11:59:19 -- setup/common.sh@18 -- # local node=0
00:05:07.091 11:59:19 -- setup/common.sh@19 -- # local var val
00:05:07.091 11:59:19 -- setup/common.sh@20 -- # local mem_f mem
00:05:07.091 11:59:19 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:07.091 11:59:19 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:05:07.091 11:59:19 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:05:07.091 11:59:19 -- setup/common.sh@28 -- # mapfile -t mem
00:05:07.091 11:59:19 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:07.091 11:59:19 -- setup/common.sh@31 -- # IFS=': '
00:05:07.091 11:59:19 -- setup/common.sh@31 -- # read -r var val _
00:05:07.091 11:59:19 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069876 kB' 'MemFree: 29825392 kB' 'MemUsed: 18244484 kB' 'SwapCached: 0 kB' 'Active: 11776208 kB' 'Inactive: 3587012 kB' 'Active(anon): 11514700 kB' 'Inactive(anon): 0 kB' 'Active(file): 261508 kB' 'Inactive(file): 3587012 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 14897024 kB' 'Mapped: 111280 kB' 'AnonPages: 469400 kB' 'Shmem: 11048504 kB' 'KernelStack: 9048 kB' 'PageTables: 5232 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 288080 kB' 'Slab: 514792 kB' 'SReclaimable: 288080 kB' 'SUnreclaim: 226712 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[scan 00:05:07.091-00:05:07.092 setup/common.sh@31/@32: key scan vs \H\u\g\e\P\a\g\e\s\_\S\u\r\p, MemTotal through HugePages_Free in the node0 snapshot order above, continue on every key]
00:05:07.092 11:59:19 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:07.092 11:59:19 -- setup/common.sh@33 -- # echo 0
00:05:07.092 11:59:19 -- setup/common.sh@33 -- # return 0
00:05:07.092 11:59:19 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
11:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:05:07.092 11:59:19 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.092 11:59:19 -- setup/common.sh@32 -- # continue 00:05:07.092 11:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:05:07.092 11:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:05:07.092 11:59:19 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.092 11:59:19 -- setup/common.sh@32 -- # continue 00:05:07.092 11:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:05:07.092 11:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:05:07.092 11:59:19 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.092 11:59:19 -- setup/common.sh@32 -- # continue 00:05:07.092 11:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:05:07.092 11:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:05:07.092 11:59:19 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.092 11:59:19 -- setup/common.sh@33 -- # echo 0 00:05:07.092 11:59:19 -- setup/common.sh@33 -- # return 0 00:05:07.092 11:59:19 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:07.092 11:59:19 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:07.092 11:59:19 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:07.092 11:59:19 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:05:07.092 11:59:19 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:07.092 11:59:19 -- setup/common.sh@18 -- # local node=1 00:05:07.092 11:59:19 -- setup/common.sh@19 -- # local var val 00:05:07.092 11:59:19 -- setup/common.sh@20 -- # local mem_f mem 00:05:07.092 11:59:19 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:07.092 11:59:19 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:05:07.092 11:59:19 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:05:07.092 11:59:19 -- setup/common.sh@28 -- # mapfile -t mem 00:05:07.092 11:59:19 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:07.092 11:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:05:07.092 11:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:05:07.092 11:59:19 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44223636 kB' 'MemFree: 39242208 kB' 'MemUsed: 4981428 kB' 'SwapCached: 0 kB' 'Active: 2777976 kB' 'Inactive: 133128 kB' 'Active(anon): 2492264 kB' 'Inactive(anon): 0 kB' 'Active(file): 285712 kB' 'Inactive(file): 133128 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2765900 kB' 'Mapped: 77424 kB' 'AnonPages: 145408 kB' 'Shmem: 2347060 kB' 'KernelStack: 7272 kB' 'PageTables: 3492 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 184696 kB' 'Slab: 366296 kB' 'SReclaimable: 184696 kB' 'SUnreclaim: 181600 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0' 00:05:07.092 11:59:19 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.092 11:59:19 -- setup/common.sh@32 -- # continue 00:05:07.092 11:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:05:07.092 11:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:05:07.092 11:59:19 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.092 11:59:19 -- setup/common.sh@32 -- # continue 
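The backslash runs such as \H\u\g\e\P\a\g\e\s\_\S\u\r\p in the trace are not corruption: bash xtrace re-prints the right-hand side of each [[ $var == $get ]] test with every character escaped, so the expanded value reads as a literal pattern. What the loop is doing is scanning a meminfo file key by key, skipping each line with continue until the requested field turns up, after mapfile and the ${mem[@]#Node +([0-9]) } strip visible at setup/common.sh@28-29 have removed the per-node "Node <n>" prefixes. A minimal self-contained sketch of that parsing pattern on Linux (the function name and the sed-based prefix strip are illustrative, not lifted from setup/common.sh):

#!/usr/bin/env bash
# Sketch: read one field from /proc/meminfo, or from a per-NUMA-node
# meminfo file when a node number is given (the fallback the trace shows
# at setup/common.sh@23-24). Per-node files prefix every line with
# "Node <n> ", which is stripped before splitting on ':' and spaces.
get_meminfo_sketch() {
  local get=$1 node=$2
  local mem_f=/proc/meminfo var val _
  if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
    mem_f=/sys/devices/system/node/node$node/meminfo
  fi
  while IFS=': ' read -r var val _; do
    if [[ $var == "$get" ]]; then   # literal match, like the escaped pattern in the trace
      echo "$val"
      return 0
    fi
  done < <(sed 's/^Node [0-9]* //' "$mem_f")
  return 1
}

# Example: the lookup the trace performs for node 1.
get_meminfo_sketch HugePages_Surp 1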
00:05:07.092 11:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:05:07.092 11:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:05:07.092 11:59:19 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.092 11:59:19 -- setup/common.sh@32 -- # continue 00:05:07.092 11:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:05:07.092 11:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:05:07.092 11:59:19 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.092 11:59:19 -- setup/common.sh@32 -- # continue 00:05:07.092 11:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:05:07.092 11:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:05:07.092 11:59:19 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.092 11:59:19 -- setup/common.sh@32 -- # continue 00:05:07.092 11:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:05:07.092 11:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:05:07.092 11:59:19 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.092 11:59:19 -- setup/common.sh@32 -- # continue 00:05:07.092 11:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:05:07.092 11:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:05:07.092 11:59:19 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.092 11:59:19 -- setup/common.sh@32 -- # continue 00:05:07.092 11:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:05:07.092 11:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:05:07.092 11:59:19 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.092 11:59:19 -- setup/common.sh@32 -- # continue 00:05:07.092 11:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:05:07.092 11:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:05:07.092 11:59:19 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.092 11:59:19 -- setup/common.sh@32 -- # continue 00:05:07.092 11:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:05:07.092 11:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:05:07.092 11:59:19 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.092 11:59:19 -- setup/common.sh@32 -- # continue 00:05:07.092 11:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:05:07.092 11:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:05:07.092 11:59:19 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.092 11:59:19 -- setup/common.sh@32 -- # continue 00:05:07.092 11:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # continue 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # continue 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # continue 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # [[ FilePages 
== \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # continue 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # continue 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # continue 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # continue 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # continue 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # continue 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # continue 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # continue 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # continue 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # continue 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # continue 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # continue 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # continue 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:05:07.093 11:59:19 -- 
setup/common.sh@31 -- # read -r var val _ 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # continue 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # continue 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # continue 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # continue 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # continue 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # continue 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # continue 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # continue 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # continue 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:05:07.093 11:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:05:07.093 11:59:19 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.093 11:59:19 -- setup/common.sh@33 -- # echo 0 00:05:07.093 11:59:19 -- setup/common.sh@33 -- # return 0 00:05:07.093 11:59:19 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:07.093 11:59:19 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:07.093 11:59:19 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:07.093 11:59:19 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:07.093 11:59:19 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513' 00:05:07.093 node0=512 expecting 513 00:05:07.093 11:59:19 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:07.093 11:59:19 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:07.093 11:59:19 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:07.093 11:59:19 -- 
setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512' 00:05:07.093 node1=513 expecting 512 00:05:07.093 11:59:19 -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:05:07.093 00:05:07.093 real 0m6.059s 00:05:07.093 user 0m2.151s 00:05:07.093 sys 0m3.980s 00:05:07.093 11:59:19 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:07.093 11:59:19 -- common/autotest_common.sh@10 -- # set +x 00:05:07.093 ************************************ 00:05:07.093 END TEST odd_alloc 00:05:07.093 ************************************ 00:05:07.093 11:59:19 -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc 00:05:07.093 11:59:19 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:07.093 11:59:19 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:07.093 11:59:19 -- common/autotest_common.sh@10 -- # set +x 00:05:07.093 ************************************ 00:05:07.093 START TEST custom_alloc 00:05:07.093 ************************************ 00:05:07.093 11:59:19 -- common/autotest_common.sh@1104 -- # custom_alloc 00:05:07.093 11:59:19 -- setup/hugepages.sh@167 -- # local IFS=, 00:05:07.093 11:59:19 -- setup/hugepages.sh@169 -- # local node 00:05:07.093 11:59:19 -- setup/hugepages.sh@170 -- # nodes_hp=() 00:05:07.093 11:59:19 -- setup/hugepages.sh@170 -- # local nodes_hp 00:05:07.093 11:59:19 -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0 00:05:07.093 11:59:19 -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576 00:05:07.093 11:59:19 -- setup/hugepages.sh@49 -- # local size=1048576 00:05:07.093 11:59:19 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:05:07.093 11:59:19 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:05:07.093 11:59:19 -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:05:07.093 11:59:19 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:05:07.093 11:59:19 -- setup/hugepages.sh@62 -- # user_nodes=() 00:05:07.093 11:59:19 -- setup/hugepages.sh@62 -- # local user_nodes 00:05:07.094 11:59:19 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:05:07.094 11:59:19 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:05:07.094 11:59:19 -- setup/hugepages.sh@67 -- # nodes_test=() 00:05:07.094 11:59:19 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:05:07.094 11:59:19 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:05:07.094 11:59:19 -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:05:07.094 11:59:19 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:05:07.094 11:59:19 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:05:07.094 11:59:19 -- setup/hugepages.sh@83 -- # : 256 00:05:07.094 11:59:19 -- setup/hugepages.sh@84 -- # : 1 00:05:07.094 11:59:19 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:05:07.094 11:59:19 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:05:07.094 11:59:19 -- setup/hugepages.sh@83 -- # : 0 00:05:07.094 11:59:19 -- setup/hugepages.sh@84 -- # : 0 00:05:07.094 11:59:19 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:05:07.094 11:59:19 -- setup/hugepages.sh@175 -- # nodes_hp[0]=512 00:05:07.094 11:59:19 -- setup/hugepages.sh@176 -- # (( 2 > 1 )) 00:05:07.094 11:59:19 -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152 00:05:07.094 11:59:19 -- setup/hugepages.sh@49 -- # local size=2097152 00:05:07.094 11:59:19 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:05:07.094 11:59:19 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:05:07.094 11:59:19 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:05:07.094 11:59:19 
-- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:05:07.094 11:59:19 -- setup/hugepages.sh@62 -- # user_nodes=()
00:05:07.094 11:59:19 -- setup/hugepages.sh@62 -- # local user_nodes
00:05:07.094 11:59:19 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:05:07.094 11:59:19 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:05:07.094 11:59:19 -- setup/hugepages.sh@67 -- # nodes_test=()
00:05:07.094 11:59:19 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:05:07.094 11:59:19 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:05:07.094 11:59:19 -- setup/hugepages.sh@74 -- # (( 1 > 0 ))
00:05:07.094 11:59:19 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}"
00:05:07.094 11:59:19 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512
00:05:07.094 11:59:19 -- setup/hugepages.sh@78 -- # return 0
00:05:07.094 11:59:19 -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024
00:05:07.094 11:59:19 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}"
00:05:07.094 11:59:19 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
00:05:07.094 11:59:19 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] ))
00:05:07.094 11:59:19 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}"
00:05:07.094 11:59:19 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
00:05:07.094 11:59:19 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] ))
00:05:07.094 11:59:19 -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node
00:05:07.094 11:59:19 -- setup/hugepages.sh@62 -- # user_nodes=()
00:05:07.094 11:59:19 -- setup/hugepages.sh@62 -- # local user_nodes
00:05:07.094 11:59:19 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:05:07.094 11:59:19 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:05:07.094 11:59:19 -- setup/hugepages.sh@67 -- # nodes_test=()
00:05:07.094 11:59:19 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:05:07.094 11:59:19 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:05:07.094 11:59:19 -- setup/hugepages.sh@74 -- # (( 2 > 0 ))
00:05:07.094 11:59:19 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}"
00:05:07.094 11:59:19 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512
00:05:07.094 11:59:19 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}"
00:05:07.094 11:59:19 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024
00:05:07.094 11:59:19 -- setup/hugepages.sh@78 -- # return 0
00:05:07.094 11:59:19 -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024'
00:05:07.094 11:59:19 -- setup/hugepages.sh@187 -- # setup output
00:05:07.094 11:59:19 -- setup/common.sh@9 -- # [[ output == output ]]
00:05:07.094 11:59:19 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:05:11.288 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:05:11.288 0000:1a:00.0 (8086 0a54): Already using the vfio-pci driver
00:05:11.288 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:05:11.288 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:05:11.288 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:05:11.288 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:05:11.288 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:05:11.288 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:05:11.288 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:05:11.288 0000:80:04.7 (8086 2021): Already using the vfio-pci
driver 00:05:11.288 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:11.288 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:05:11.288 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:11.288 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:05:11.288 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:05:11.288 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:05:11.288 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:05:12.666 11:59:25 -- setup/hugepages.sh@188 -- # nr_hugepages=1536 00:05:12.666 11:59:25 -- setup/hugepages.sh@188 -- # verify_nr_hugepages 00:05:12.666 11:59:25 -- setup/hugepages.sh@89 -- # local node 00:05:12.666 11:59:25 -- setup/hugepages.sh@90 -- # local sorted_t 00:05:12.666 11:59:25 -- setup/hugepages.sh@91 -- # local sorted_s 00:05:12.666 11:59:25 -- setup/hugepages.sh@92 -- # local surp 00:05:12.666 11:59:25 -- setup/hugepages.sh@93 -- # local resv 00:05:12.666 11:59:25 -- setup/hugepages.sh@94 -- # local anon 00:05:12.666 11:59:25 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:12.666 11:59:25 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:05:12.666 11:59:25 -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:12.666 11:59:25 -- setup/common.sh@18 -- # local node= 00:05:12.666 11:59:25 -- setup/common.sh@19 -- # local var val 00:05:12.666 11:59:25 -- setup/common.sh@20 -- # local mem_f mem 00:05:12.666 11:59:25 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:12.666 11:59:25 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:12.666 11:59:25 -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:12.666 11:59:25 -- setup/common.sh@28 -- # mapfile -t mem 00:05:12.666 11:59:25 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:12.929 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.929 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.929 11:59:25 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293512 kB' 'MemFree: 68046900 kB' 'MemAvailable: 71951060 kB' 'Buffers: 8112 kB' 'Cached: 17654944 kB' 'SwapCached: 0 kB' 'Active: 14554824 kB' 'Inactive: 3720140 kB' 'Active(anon): 14007604 kB' 'Inactive(anon): 0 kB' 'Active(file): 547220 kB' 'Inactive(file): 3720140 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 614684 kB' 'Mapped: 188852 kB' 'Shmem: 13395696 kB' 'KReclaimable: 472768 kB' 'Slab: 881944 kB' 'SReclaimable: 472768 kB' 'SUnreclaim: 409176 kB' 'KernelStack: 16640 kB' 'PageTables: 9712 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962496 kB' 'Committed_AS: 15472364 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214440 kB' 'VmallocChunk: 0 kB' 'Percpu: 69120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 1592732 kB' 'DirectMap2M: 41074688 kB' 'DirectMap1G: 58720256 kB' 00:05:12.929 11:59:25 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:12.929 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.929 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.929 11:59:25 
-- setup/common.sh@31 -- # read -r var val _ 00:05:12.929 11:59:25 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:12.929 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # continue 
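The HugePages_Total: 1536 in the snapshot a few lines up is exactly what the custom_alloc preamble asked for: get_test_nr_hugepages 1048576 became nr_hugepages=512 and get_test_nr_hugepages 2097152 became nr_hugepages=1024, the sizes in kB divided by the default 2048 kB hugepage size that the snapshot also reports. A quick check of that arithmetic in plain bash (this is a worked example, not code from the suite):

#!/usr/bin/env bash
# Verify the sizing arithmetic seen in the trace.
hugepagesize_kb=2048      # 'Hugepagesize: 2048 kB' in the snapshot

pages() { echo $(( $1 / hugepagesize_kb )); }

node0=$(pages 1048576)    # 1048576 / 2048 = 512
node1=$(pages 2097152)    # 2097152 / 2048 = 1024
echo "nodes_hp[0]=$node0 nodes_hp[1]=$node1 total=$(( node0 + node1 ))"
# -> 512, 1024, 1536: matching HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024'
#    and the 'HugePages_Total: 1536' reported by /proc/meminfo above.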
00:05:12.930 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:12.930 
11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.930 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.930 11:59:25 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:12.931 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 11:59:25 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:12.931 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 11:59:25 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:12.931 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 11:59:25 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:12.931 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 11:59:25 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:12.931 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 11:59:25 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:12.931 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 11:59:25 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:12.931 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 11:59:25 -- 
setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:12.931 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 11:59:25 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:12.931 11:59:25 -- setup/common.sh@33 -- # echo 0 00:05:12.931 11:59:25 -- setup/common.sh@33 -- # return 0 00:05:12.931 11:59:25 -- setup/hugepages.sh@97 -- # anon=0 00:05:12.931 11:59:25 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:05:12.931 11:59:25 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:12.931 11:59:25 -- setup/common.sh@18 -- # local node= 00:05:12.931 11:59:25 -- setup/common.sh@19 -- # local var val 00:05:12.931 11:59:25 -- setup/common.sh@20 -- # local mem_f mem 00:05:12.931 11:59:25 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:12.931 11:59:25 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:12.931 11:59:25 -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:12.931 11:59:25 -- setup/common.sh@28 -- # mapfile -t mem 00:05:12.931 11:59:25 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 11:59:25 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293512 kB' 'MemFree: 68051484 kB' 'MemAvailable: 71955644 kB' 'Buffers: 8112 kB' 'Cached: 17654948 kB' 'SwapCached: 0 kB' 'Active: 14554180 kB' 'Inactive: 3720140 kB' 'Active(anon): 14006960 kB' 'Inactive(anon): 0 kB' 'Active(file): 547220 kB' 'Inactive(file): 3720140 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 614456 kB' 'Mapped: 188776 kB' 'Shmem: 13395700 kB' 'KReclaimable: 472768 kB' 'Slab: 881736 kB' 'SReclaimable: 472768 kB' 'SUnreclaim: 408968 kB' 'KernelStack: 16416 kB' 'PageTables: 8780 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962496 kB' 'Committed_AS: 15473772 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214344 kB' 'VmallocChunk: 0 kB' 'Percpu: 69120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 1592732 kB' 'DirectMap2M: 41074688 kB' 'DirectMap1G: 58720256 kB' 00:05:12.931 11:59:25 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.931 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 11:59:25 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.931 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 11:59:25 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.931 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 
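With anon coming back as 0 (the snapshot's AnonHugePages: 0 kB), verify_nr_hugepages moves on to HugePages_Surp and, after that, HugePages_Rsvd, the counters that can shift a node's expected page count away from what was configured. A hedged one-pass equivalent of those lookups, using awk where the suite scans key by key:

#!/usr/bin/env bash
# One-pass read of the hugepage bookkeeping the verifier collects
# piecemeal in the trace (illustrative, not the suite's own code).
eval "$(awk -F'[: ]+' '
  $1 ~ /^(AnonHugePages|HugePages_(Total|Free|Rsvd|Surp))$/ {
    print tolower($1) "=" $2
  }' /proc/meminfo)"

echo "anon=$anonhugepages surp=$hugepages_surp resv=$hugepages_rsvd"
echo "total=$hugepages_total free=$hugepages_free"
# In the snapshots above all three correction terms are 0, so the per-node
# expectations should stay at the configured 512 and 1024 pages.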
00:05:12.931 11:59:25 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.931 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 11:59:25 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.931 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 11:59:25 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.931 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 11:59:25 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.931 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 11:59:25 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.931 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 11:59:25 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.931 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 11:59:25 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.931 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 11:59:25 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.931 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 11:59:25 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.931 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 11:59:25 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.931 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 11:59:25 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.931 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 11:59:25 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.931 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 11:59:25 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.931 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.931 11:59:25 -- 
setup/common.sh@31 -- # IFS=': ' 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.931 11:59:25 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.931 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.931 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
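For reference while the HugePages_Surp scan continues: the HUGENODE string exported ahead of the setup.sh run shown earlier ('nodes_hp[0]=512,nodes_hp[1]=1024') requests per-NUMA-node placement, and the kernel's per-node knob for that is the nr_hugepages file under each node's hugepages directory in sysfs. A sketch of that mechanism under the same assumptions (2048 kB pages, two nodes); this is illustrative, not setup.sh itself:

#!/usr/bin/env bash
# Illustrative per-node hugepage placement via the standard sysfs knob.
declare -A want=( [0]=512 [1]=1024 )

for node in "${!want[@]}"; do
  knob=/sys/devices/system/node/node$node/hugepages/hugepages-2048kB/nr_hugepages
  echo "${want[$node]}" | sudo tee "$knob" > /dev/null
  printf 'node%s -> %s pages\n' "$node" "$(cat "$knob")"
done
# A subsequent /proc/meminfo read should then show HugePages_Total: 1536,
# as the snapshots in this trace do.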
00:05:12.932 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # 
read -r var val _ 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.932 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.932 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.933 11:59:25 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.933 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.933 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.933 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.933 11:59:25 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.933 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.933 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.933 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.933 11:59:25 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.933 11:59:25 -- setup/common.sh@33 -- # echo 0 00:05:12.933 11:59:25 -- setup/common.sh@33 -- # return 0 00:05:12.933 11:59:25 -- setup/hugepages.sh@99 -- # surp=0 00:05:12.933 11:59:25 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:05:12.933 11:59:25 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:12.933 11:59:25 -- setup/common.sh@18 -- # local node= 00:05:12.933 11:59:25 -- setup/common.sh@19 -- # local var val 00:05:12.933 11:59:25 -- setup/common.sh@20 -- # local mem_f mem 00:05:12.933 11:59:25 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:12.933 
11:59:25 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:12.933 11:59:25 -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:12.933 11:59:25 -- setup/common.sh@28 -- # mapfile -t mem 00:05:12.933 11:59:25 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:12.933 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.933 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.933 11:59:25 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293512 kB' 'MemFree: 68052884 kB' 'MemAvailable: 71957044 kB' 'Buffers: 8112 kB' 'Cached: 17654956 kB' 'SwapCached: 0 kB' 'Active: 14554880 kB' 'Inactive: 3720140 kB' 'Active(anon): 14007660 kB' 'Inactive(anon): 0 kB' 'Active(file): 547220 kB' 'Inactive(file): 3720140 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 615192 kB' 'Mapped: 188776 kB' 'Shmem: 13395708 kB' 'KReclaimable: 472768 kB' 'Slab: 881736 kB' 'SReclaimable: 472768 kB' 'SUnreclaim: 408968 kB' 'KernelStack: 16448 kB' 'PageTables: 8996 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962496 kB' 'Committed_AS: 15473788 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214392 kB' 'VmallocChunk: 0 kB' 'Percpu: 69120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 1592732 kB' 'DirectMap2M: 41074688 kB' 'DirectMap1G: 58720256 kB' 00:05:12.933 11:59:25 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.933 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.933 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.933 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.933 11:59:25 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.933 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.933 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.933 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.933 11:59:25 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.933 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.933 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.933 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.933 11:59:25 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.933 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.933 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.933 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.933 11:59:25 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.933 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.933 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.933 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.933 11:59:25 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.933 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.933 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.933 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.933 11:59:25 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.933 11:59:25 -- 
setup/common.sh@32 -- # continue 00:05:12.933 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.933 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.933 11:59:25 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.933 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.933 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.933 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.933 11:59:25 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.933 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.933 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.933 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.933 11:59:25 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.933 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.933 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.933 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.933 11:59:25 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.933 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.933 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.933 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.933 11:59:25 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.933 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.933 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.933 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.933 11:59:25 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.933 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.933 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.933 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.933 11:59:25 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.933 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.933 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.933 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.933 11:59:25 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.933 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.933 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.933 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.933 11:59:25 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.933 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.933 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.933 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.933 11:59:25 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.933 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.933 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.933 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.933 11:59:25 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.933 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.933 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.933 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.933 11:59:25 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.933 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.933 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.933 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.933 11:59:25 -- 
setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.933 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.933 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.933 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.933 11:59:25 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.933 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.933 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.933 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 
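The scan being traced here is setup/common.sh's get_meminfo walking every /proc/meminfo key until it reaches the requested field. A minimal, self-contained sketch of that pattern follows; the function body is condensed from what the trace shows, and the sed call stands in for the extglob strip mem=("${mem[@]#Node +([0-9]) }") visible above, so it is an approximation rather than the script's literal code:

get_meminfo() {
  local get=$1 node=${2:-}
  local mem_f=/proc/meminfo
  # Per-node counters live in sysfs when a node index is supplied
  if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
    mem_f=/sys/devices/system/node/node$node/meminfo
  fi
  local var val _
  # Each line is "Key:   value [kB]"; IFS=': ' splits key from value
  while IFS=': ' read -r var val _; do
    [[ $var == "$get" ]] && { echo "${val:-0}"; return 0; }
  done < <(sed 's/^Node [0-9]* //' "$mem_f")
  echo 0   # key not present
}
# usage: get_meminfo HugePages_Total     (system-wide)
#        get_meminfo HugePages_Free 0    (node 0 only)

Every "continue" record in the trace is one non-matching key being skipped by exactly this loop, which is why the log repeats once per /proc/meminfo field.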
00:05:12.934 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # [[ FilePmdMapped == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.934 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.934 11:59:25 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:12.934 11:59:25 -- setup/common.sh@33 -- # echo 0 00:05:12.935 11:59:25 -- setup/common.sh@33 -- # return 0 00:05:12.935 11:59:25 -- setup/hugepages.sh@100 -- # resv=0 00:05:12.935 11:59:25 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536 00:05:12.935 nr_hugepages=1536 00:05:12.935 11:59:25 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:05:12.935 resv_hugepages=0 00:05:12.935 11:59:25 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:05:12.935 surplus_hugepages=0 00:05:12.935 11:59:25 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:05:12.935 anon_hugepages=0 00:05:12.935 11:59:25 -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv )) 00:05:12.935 11:59:25 -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages )) 00:05:12.935 11:59:25 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:05:12.935 11:59:25 -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:12.935 11:59:25 -- setup/common.sh@18 -- # local node= 00:05:12.935 11:59:25 -- setup/common.sh@19 -- # local var val 00:05:12.935 11:59:25 -- setup/common.sh@20 -- # local mem_f mem 00:05:12.935 11:59:25 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:12.935 11:59:25 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:12.935 11:59:25 -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:12.935 11:59:25 -- setup/common.sh@28 -- # mapfile -t mem 00:05:12.935 11:59:25 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:12.935 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.935 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.935 11:59:25 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293512 kB' 'MemFree: 68050084 kB' 'MemAvailable: 71954244 kB' 'Buffers: 8112 kB' 'Cached: 17654972 kB' 'SwapCached: 0 kB' 'Active: 
14554516 kB' 'Inactive: 3720140 kB' 'Active(anon): 14007296 kB' 'Inactive(anon): 0 kB' 'Active(file): 547220 kB' 'Inactive(file): 3720140 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 614740 kB' 'Mapped: 188776 kB' 'Shmem: 13395724 kB' 'KReclaimable: 472768 kB' 'Slab: 881608 kB' 'SReclaimable: 472768 kB' 'SUnreclaim: 408840 kB' 'KernelStack: 16544 kB' 'PageTables: 9140 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962496 kB' 'Committed_AS: 15473800 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214408 kB' 'VmallocChunk: 0 kB' 'Percpu: 69120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 1592732 kB' 'DirectMap2M: 41074688 kB' 'DirectMap1G: 58720256 kB' 00:05:12.935 11:59:25 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.935 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.935 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.935 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.935 11:59:25 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.935 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.935 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.935 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.935 11:59:25 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.935 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.935 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.935 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.935 11:59:25 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.935 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.935 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.935 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.935 11:59:25 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.935 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.935 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.935 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.935 11:59:25 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.935 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.935 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.935 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.935 11:59:25 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.935 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.935 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.935 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.935 11:59:25 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.935 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.935 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.935 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.935 11:59:25 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.935 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.935 
11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.935 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.935 11:59:25 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.935 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.935 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.935 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.935 11:59:25 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.935 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.935 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.935 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.935 11:59:25 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.935 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.935 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.935 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.935 11:59:25 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.935 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.935 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.935 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.935 11:59:25 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.935 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.935 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.935 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # [[ 
Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 
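This second full scan feeds the accounting check (( 1536 == nr_hugepages + surp + resv )) seen just before it. A short sketch of that arithmetic, with values taken from this run and variable names mirroring setup/hugepages.sh (the standalone framing, and get_meminfo as sketched earlier, are assumptions):

nr_hugepages=1536   # requested: 512 on node0 + 1024 on node1
surp=$(get_meminfo HugePages_Surp)    # 0 in this run
resv=$(get_meminfo HugePages_Rsvd)    # 0 in this run
total=$(get_meminfo HugePages_Total)  # 1536 in this run
# The allocation only counts as successful when the kernel's total
# matches the request plus any surplus/reserved pages
(( total == nr_hugepages + surp + resv )) || echo 'hugepage count mismatch' >&2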
00:05:12.936 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.936 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.936 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.937 11:59:25 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.937 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.937 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.937 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.937 11:59:25 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.937 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.937 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.937 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.937 11:59:25 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.937 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.937 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.937 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.937 11:59:25 -- setup/common.sh@32 -- # [[ CmaFree == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.937 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.937 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.937 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.937 11:59:25 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.937 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.937 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.937 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.937 11:59:25 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:12.937 11:59:25 -- setup/common.sh@33 -- # echo 1536 00:05:12.937 11:59:25 -- setup/common.sh@33 -- # return 0 00:05:12.937 11:59:25 -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv )) 00:05:12.937 11:59:25 -- setup/hugepages.sh@112 -- # get_nodes 00:05:12.937 11:59:25 -- setup/hugepages.sh@27 -- # local node 00:05:12.937 11:59:25 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:12.937 11:59:25 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:05:12.937 11:59:25 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:12.937 11:59:25 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:05:12.937 11:59:25 -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:12.937 11:59:25 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:12.937 11:59:25 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:12.937 11:59:25 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:12.937 11:59:25 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:05:12.937 11:59:25 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:12.937 11:59:25 -- setup/common.sh@18 -- # local node=0 00:05:12.937 11:59:25 -- setup/common.sh@19 -- # local var val 00:05:12.937 11:59:25 -- setup/common.sh@20 -- # local mem_f mem 00:05:12.937 11:59:25 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:12.937 11:59:25 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:12.937 11:59:25 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:12.937 11:59:25 -- setup/common.sh@28 -- # mapfile -t mem 00:05:12.937 11:59:25 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:12.937 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.937 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.937 11:59:25 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069876 kB' 'MemFree: 29857940 kB' 'MemUsed: 18211936 kB' 'SwapCached: 0 kB' 'Active: 11775048 kB' 'Inactive: 3587012 kB' 'Active(anon): 11513540 kB' 'Inactive(anon): 0 kB' 'Active(file): 261508 kB' 'Inactive(file): 3587012 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 14897156 kB' 'Mapped: 111284 kB' 'AnonPages: 468028 kB' 'Shmem: 11048636 kB' 'KernelStack: 9032 kB' 'PageTables: 5184 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 288080 kB' 'Slab: 514980 kB' 'SReclaimable: 288080 kB' 'SUnreclaim: 226900 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:12.937 11:59:25 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.937 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.937 11:59:25 -- 
setup/common.sh@31 -- # IFS=': ' 00:05:12.937 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.937 11:59:25 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.937 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.937 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.937 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.937 11:59:25 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.937 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.937 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.937 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.937 11:59:25 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.937 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.937 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.937 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.937 11:59:25 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.937 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.937 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.937 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.937 11:59:25 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.937 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.937 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.937 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.937 11:59:25 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.937 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.937 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.937 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.937 11:59:25 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.937 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.937 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.937 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.937 11:59:25 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.937 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.937 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.937 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.937 11:59:25 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.937 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.937 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.937 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.937 11:59:25 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.937 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.937 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.937 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.937 11:59:25 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.937 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.937 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.937 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.937 11:59:25 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.937 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.937 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.937 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.937 11:59:25 -- setup/common.sh@32 -- # [[ Writeback == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.937 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.937 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.937 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.937 11:59:25 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.937 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.938 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.938 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.938 11:59:25 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.938 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.938 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.938 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.938 11:59:25 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.938 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.938 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.938 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.938 11:59:25 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.938 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.938 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.938 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.938 11:59:25 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.938 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.938 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.938 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.938 11:59:25 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.938 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.938 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.938 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.938 11:59:25 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.938 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.938 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.938 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.938 11:59:25 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.938 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.938 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.938 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.938 11:59:25 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.938 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.938 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.938 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.938 11:59:25 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.938 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.938 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.938 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.938 11:59:25 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.938 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.938 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.938 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.938 11:59:25 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.938 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.938 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.938 11:59:25 -- setup/common.sh@31 
-- # read -r var val _ 00:05:12.938 11:59:25 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.938 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.938 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.938 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.938 11:59:25 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.938 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.938 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.938 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.938 11:59:25 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.938 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.938 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.938 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.938 11:59:25 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.938 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.938 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.938 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.938 11:59:25 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.938 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.938 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.938 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.938 11:59:25 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.938 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.938 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.938 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.938 11:59:25 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.938 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.938 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.938 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.938 11:59:25 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.938 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.938 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.938 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.938 11:59:25 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.938 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.938 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.938 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.938 11:59:25 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.938 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.938 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.938 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.938 11:59:25 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.938 11:59:25 -- setup/common.sh@33 -- # echo 0 00:05:12.938 11:59:25 -- setup/common.sh@33 -- # return 0 00:05:12.938 11:59:25 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:12.938 11:59:25 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:12.938 11:59:25 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:12.938 11:59:25 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:05:12.938 11:59:25 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:12.938 11:59:25 -- setup/common.sh@18 -- # local node=1 
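With the global totals confirmed, the test walks each NUMA node and re-reads HugePages_Surp from /sys/devices/system/node/nodeN/meminfo, which is what the node=0 and node=1 calls around this point show. A condensed sketch of that per-node pass; the expected split of 512/1024 comes from this run, while the associative-array framing is a simplification of the script's nodes_test handling:

# Expected split for this custom_alloc run: 512 pages on node0, 1024 on node1
declare -A nodes_test=([0]=512 [1]=1024)
for node in "${!nodes_test[@]}"; do
  # Any surplus pages on the node are folded into the expected count
  (( nodes_test[node] += $(get_meminfo HugePages_Surp "$node") ))
  echo "node$node=${nodes_test[node]} expecting ${nodes_test[node]}"
done

With zero surplus on both nodes, this prints the "node0=512 expecting 512" and "node1=1024 expecting 1024" lines that appear further down in the trace.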
00:05:12.938 11:59:25 -- setup/common.sh@19 -- # local var val 00:05:12.938 11:59:25 -- setup/common.sh@20 -- # local mem_f mem 00:05:12.938 11:59:25 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:12.938 11:59:25 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:05:12.938 11:59:25 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:05:12.938 11:59:25 -- setup/common.sh@28 -- # mapfile -t mem 00:05:12.938 11:59:25 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:12.938 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.938 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.938 11:59:25 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44223636 kB' 'MemFree: 38192856 kB' 'MemUsed: 6030780 kB' 'SwapCached: 0 kB' 'Active: 2778508 kB' 'Inactive: 133128 kB' 'Active(anon): 2492796 kB' 'Inactive(anon): 0 kB' 'Active(file): 285712 kB' 'Inactive(file): 133128 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2765944 kB' 'Mapped: 77492 kB' 'AnonPages: 145816 kB' 'Shmem: 2347104 kB' 'KernelStack: 7272 kB' 'PageTables: 3500 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 184688 kB' 'Slab: 366472 kB' 'SReclaimable: 184688 kB' 'SUnreclaim: 181784 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:12.938 11:59:25 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.938 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.938 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # [[ Inactive(anon) == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.939 11:59:25 -- setup/common.sh@31 -- 
# read -r var val _ 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.939 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.939 11:59:25 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.940 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.940 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.940 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.940 11:59:25 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.940 11:59:25 -- setup/common.sh@32 -- 
# continue 00:05:12.940 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.940 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.940 11:59:25 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.940 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.940 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.940 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.940 11:59:25 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.940 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.940 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.940 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.940 11:59:25 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.940 11:59:25 -- setup/common.sh@32 -- # continue 00:05:12.940 11:59:25 -- setup/common.sh@31 -- # IFS=': ' 00:05:12.940 11:59:25 -- setup/common.sh@31 -- # read -r var val _ 00:05:12.940 11:59:25 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:12.940 11:59:25 -- setup/common.sh@33 -- # echo 0 00:05:12.940 11:59:25 -- setup/common.sh@33 -- # return 0 00:05:12.940 11:59:25 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:12.940 11:59:25 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:12.940 11:59:25 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:12.940 11:59:25 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:12.940 11:59:25 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:05:12.940 node0=512 expecting 512 00:05:12.940 11:59:25 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:12.940 11:59:25 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:12.940 11:59:25 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:12.940 11:59:25 -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024' 00:05:12.940 node1=1024 expecting 1024 00:05:12.940 11:59:25 -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]] 00:05:12.940 00:05:12.940 real 0m5.974s 00:05:12.940 user 0m2.067s 00:05:12.940 sys 0m3.931s 00:05:12.940 11:59:25 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:12.940 11:59:25 -- common/autotest_common.sh@10 -- # set +x 00:05:12.940 ************************************ 00:05:12.940 END TEST custom_alloc 00:05:12.940 ************************************ 00:05:12.940 11:59:25 -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc 00:05:12.940 11:59:25 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:12.940 11:59:25 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:12.940 11:59:25 -- common/autotest_common.sh@10 -- # set +x 00:05:12.940 ************************************ 00:05:12.940 START TEST no_shrink_alloc 00:05:12.940 ************************************ 00:05:12.940 11:59:25 -- common/autotest_common.sh@1104 -- # no_shrink_alloc 00:05:12.940 11:59:25 -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0 00:05:12.940 11:59:25 -- setup/hugepages.sh@49 -- # local size=2097152 00:05:12.940 11:59:25 -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:05:12.940 11:59:25 -- setup/hugepages.sh@51 -- # shift 00:05:12.940 11:59:25 -- setup/hugepages.sh@52 -- # node_ids=('0') 00:05:12.940 11:59:25 -- setup/hugepages.sh@52 -- # local node_ids 00:05:12.940 11:59:25 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:05:12.940 11:59:25 -- 
setup/hugepages.sh@57 -- # nr_hugepages=1024 00:05:12.940 11:59:25 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:05:12.940 11:59:25 -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:05:12.940 11:59:25 -- setup/hugepages.sh@62 -- # local user_nodes 00:05:12.940 11:59:25 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:05:12.940 11:59:25 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:05:12.940 11:59:25 -- setup/hugepages.sh@67 -- # nodes_test=() 00:05:12.940 11:59:25 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:05:12.940 11:59:25 -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:05:12.940 11:59:25 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:05:12.940 11:59:25 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:05:12.940 11:59:25 -- setup/hugepages.sh@73 -- # return 0 00:05:12.940 11:59:25 -- setup/hugepages.sh@198 -- # setup output 00:05:12.940 11:59:25 -- setup/common.sh@9 -- # [[ output == output ]] 00:05:12.940 11:59:25 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:17.223 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:05:17.223 0000:1a:00.0 (8086 0a54): Already using the vfio-pci driver 00:05:17.223 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:05:17.223 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:17.223 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:17.223 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:05:17.223 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:05:17.223 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:05:17.223 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:05:17.223 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:05:17.223 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:17.223 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:05:17.223 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:17.223 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:05:17.223 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:05:17.223 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:05:17.223 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:05:19.209 11:59:31 -- setup/hugepages.sh@199 -- # verify_nr_hugepages 00:05:19.209 11:59:31 -- setup/hugepages.sh@89 -- # local node 00:05:19.209 11:59:31 -- setup/hugepages.sh@90 -- # local sorted_t 00:05:19.209 11:59:31 -- setup/hugepages.sh@91 -- # local sorted_s 00:05:19.209 11:59:31 -- setup/hugepages.sh@92 -- # local surp 00:05:19.209 11:59:31 -- setup/hugepages.sh@93 -- # local resv 00:05:19.209 11:59:31 -- setup/hugepages.sh@94 -- # local anon 00:05:19.209 11:59:31 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:19.209 11:59:31 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:05:19.209 11:59:31 -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:19.209 11:59:31 -- setup/common.sh@18 -- # local node= 00:05:19.209 11:59:31 -- setup/common.sh@19 -- # local var val 00:05:19.209 11:59:31 -- setup/common.sh@20 -- # local mem_f mem 00:05:19.209 11:59:31 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:19.209 11:59:31 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:19.209 11:59:31 -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:19.209 11:59:31 -- setup/common.sh@28 -- # mapfile -t mem 00:05:19.209 11:59:31 
00:05:19.209 11:59:31 -- setup/hugepages.sh@199 -- # verify_nr_hugepages
00:05:19.209 11:59:31 -- setup/hugepages.sh@89 -- # local node
00:05:19.209 11:59:31 -- setup/hugepages.sh@90 -- # local sorted_t
00:05:19.209 11:59:31 -- setup/hugepages.sh@91 -- # local sorted_s
00:05:19.209 11:59:31 -- setup/hugepages.sh@92 -- # local surp
00:05:19.209 11:59:31 -- setup/hugepages.sh@93 -- # local resv
00:05:19.209 11:59:31 -- setup/hugepages.sh@94 -- # local anon
00:05:19.209 11:59:31 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:05:19.209 11:59:31 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:05:19.209 11:59:31 -- setup/common.sh@17 -- # local get=AnonHugePages
00:05:19.209 11:59:31 -- setup/common.sh@18 -- # local node=
00:05:19.209 11:59:31 -- setup/common.sh@19 -- # local var val
00:05:19.209 11:59:31 -- setup/common.sh@20 -- # local mem_f mem
00:05:19.209 11:59:31 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:19.209 11:59:31 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:19.209 11:59:31 -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:19.209 11:59:31 -- setup/common.sh@28 -- # mapfile -t mem
00:05:19.209 11:59:31 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:19.209 11:59:31 -- setup/common.sh@31 -- # IFS=': '
00:05:19.209 11:59:31 -- setup/common.sh@31 -- # read -r var val _
00:05:19.209 11:59:31 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293512 kB' 'MemFree: 69101796 kB' 'MemAvailable: 73005956 kB' 'Buffers: 8112 kB' 'Cached: 17655100 kB' 'SwapCached: 0 kB' 'Active: 14554620 kB' 'Inactive: 3720140 kB' 'Active(anon): 14007400 kB' 'Inactive(anon): 0 kB' 'Active(file): 547220 kB' 'Inactive(file): 3720140 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 615112 kB' 'Mapped: 188844 kB' 'Shmem: 13395852 kB' 'KReclaimable: 472768 kB' 'Slab: 881272 kB' 'SReclaimable: 472768 kB' 'SUnreclaim: 408504 kB' 'KernelStack: 16304 kB' 'PageTables: 8700 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486784 kB' 'Committed_AS: 15470384 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214248 kB' 'VmallocChunk: 0 kB' 'Percpu: 69120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1592732 kB' 'DirectMap2M: 41074688 kB' 'DirectMap1G: 58720256 kB'
[.. setup/common.sh@32 field scan elided: each /proc/meminfo field above is tested against AnonHugePages and skipped with continue until AnonHugePages matches ..]
00:05:19.209 11:59:31 -- setup/common.sh@33 -- # echo 0
00:05:19.209 11:59:31 -- setup/common.sh@33 -- # return 0
00:05:19.209 11:59:31 -- setup/hugepages.sh@97 -- # anon=0
00:05:19.209 11:59:31 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:05:19.209 11:59:31 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:19.209 11:59:31 -- setup/common.sh@18 -- # local node=
00:05:19.209 11:59:31 -- setup/common.sh@19 -- # local var val
00:05:19.209 11:59:31 -- setup/common.sh@20 -- # local mem_f mem
00:05:19.209 11:59:31 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:19.209 11:59:31 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:19.209 11:59:31 -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:19.209 11:59:31 -- setup/common.sh@28 -- # mapfile -t mem
00:05:19.210 11:59:31 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:19.210 11:59:31 -- setup/common.sh@31 -- # IFS=': '
00:05:19.210 11:59:31 -- setup/common.sh@31 -- # read -r var val _
00:05:19.210 11:59:31 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293512 kB' 'MemFree: 69102948 kB' 'MemAvailable: 73007108 kB' 'Buffers: 8112 kB' 'Cached: 17655104 kB' 'SwapCached: 0 kB' 'Active: 14553968 kB' 'Inactive: 3720140 kB' 'Active(anon): 14006748 kB' 'Inactive(anon): 0 kB' 'Active(file): 547220 kB' 'Inactive(file): 3720140 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 614432 kB' 'Mapped: 188844 kB' 'Shmem: 13395856 kB' 'KReclaimable: 472768 kB' 'Slab: 881312 kB' 'SReclaimable: 472768 kB' 'SUnreclaim: 408544 kB' 'KernelStack: 16304 kB' 'PageTables: 8680 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486784 kB' 'Committed_AS: 15470396 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214216 kB' 'VmallocChunk: 0 kB' 'Percpu: 69120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1592732 kB' 'DirectMap2M: 41074688 kB' 'DirectMap1G: 58720256 kB'
[.. setup/common.sh@32 field scan elided: each /proc/meminfo field above is tested against HugePages_Surp and skipped with continue until HugePages_Surp matches ..]
00:05:19.211 11:59:31 -- setup/common.sh@33 -- # echo 0
00:05:19.211 11:59:31 -- setup/common.sh@33 -- # return 0
00:05:19.211 11:59:31 -- setup/hugepages.sh@99 -- # surp=0
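Each get_meminfo call above follows the same shape: pick a meminfo source, strip the per-node prefix, then scan field by field until the requested key matches. As a reading aid, here is a sketch of that helper reconstructed purely from the xtrace; the real helper lives in setup/common.sh and may differ in detail:

#!/usr/bin/env bash
shopt -s extglob   # needed for the +([0-9]) pattern below

# Sketch of get_meminfo as the trace shows it: read the system-wide or
# per-node meminfo file and print the value of one field.
get_meminfo() {
	local get=$1 node=${2:-}
	local var val
	local mem_f mem line

	mem_f=/proc/meminfo
	if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
		# Per-node statistics when a node id is given.
		mem_f=/sys/devices/system/node/node$node/meminfo
	fi

	mapfile -t mem < "$mem_f"
	# Per-node files prefix every line with "Node N "; strip it.
	mem=("${mem[@]#Node +([0-9]) }")

	for line in "${mem[@]}"; do
		# "HugePages_Surp:      0" -> var=HugePages_Surp, val=0
		IFS=': ' read -r var val _ <<< "$line"
		[[ $var == "$get" ]] || continue
		echo "$val"
		return 0
	done
	return 1
}

get_meminfo AnonHugePages     # -> 0 on this host
get_meminfo HugePages_Total   # -> 1024

This scan is what produces the long [[ ... ]] / continue runs noted in the elided spans: every field before the requested one is read once and skipped.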
00:05:19.211 11:59:31 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:05:19.211 11:59:31 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:05:19.211 11:59:31 -- setup/common.sh@18 -- # local node=
00:05:19.211 11:59:31 -- setup/common.sh@19 -- # local var val
00:05:19.211 11:59:31 -- setup/common.sh@20 -- # local mem_f mem
00:05:19.211 11:59:31 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:19.211 11:59:31 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:19.211 11:59:31 -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:19.211 11:59:31 -- setup/common.sh@28 -- # mapfile -t mem
00:05:19.211 11:59:31 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:19.211 11:59:31 -- setup/common.sh@31 -- # IFS=': '
00:05:19.211 11:59:31 -- setup/common.sh@31 -- # read -r var val _
00:05:19.212 11:59:31 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293512 kB' 'MemFree: 69101900 kB' 'MemAvailable: 73006060 kB' 'Buffers: 8112 kB' 'Cached: 17655116 kB' 'SwapCached: 0 kB' 'Active: 14553976 kB' 'Inactive: 3720140 kB' 'Active(anon): 14006756 kB' 'Inactive(anon): 0 kB' 'Active(file): 547220 kB' 'Inactive(file): 3720140 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 614428 kB' 'Mapped: 188844 kB' 'Shmem: 13395868 kB' 'KReclaimable: 472768 kB' 'Slab: 881312 kB' 'SReclaimable: 472768 kB' 'SUnreclaim: 408544 kB' 'KernelStack: 16304 kB' 'PageTables: 8680 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486784 kB' 'Committed_AS: 15470412 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214216 kB' 'VmallocChunk: 0 kB' 'Percpu: 69120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1592732 kB' 'DirectMap2M: 41074688 kB' 'DirectMap1G: 58720256 kB'
[.. setup/common.sh@32 field scan elided: each /proc/meminfo field above is tested against HugePages_Rsvd and skipped with continue until HugePages_Rsvd matches ..]
00:05:19.213 11:59:31 -- setup/common.sh@33 -- # echo 0
00:05:19.213 11:59:31 -- setup/common.sh@33 -- # return 0
00:05:19.213 11:59:31 -- setup/hugepages.sh@100 -- # resv=0
00:05:19.213 11:59:31 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:05:19.213 nr_hugepages=1024
00:05:19.213 11:59:31 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:05:19.213 resv_hugepages=0
00:05:19.213 11:59:32 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:05:19.213 surplus_hugepages=0
00:05:19.213 11:59:32 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:05:19.213 anon_hugepages=0
00:05:19.213 11:59:32 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:05:19.213 11:59:32 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
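The four echo lines and the two arithmetic guards above are the whole global check. In sketch form, with variable names taken from the trace (the function wrapper itself is illustrative, not part of hugepages.sh):

#!/usr/bin/env bash
# Sketch of the global consistency check: the pool passes when
# HugePages_Total covers the requested pages plus surplus and reserved
# pages, and - with surp and resv both 0 - equals nr_hugepages exactly.
verify_totals() {
	local nr_hugepages=$1 anon=$2 surp=$3 resv=$4 total=$5

	echo "nr_hugepages=$nr_hugepages"
	echo "resv_hugepages=$resv"
	echo "surplus_hugepages=$surp"
	echo "anon_hugepages=$anon"

	((total == nr_hugepages + surp + resv)) || return 1
	((total == nr_hugepages)) || return 1
}

verify_totals 1024 0 0 0 1024 && echo 'global hugepage pool consistent'

Both guards pass in this run because AnonHugePages, HugePages_Surp and HugePages_Rsvd all read 0 while HugePages_Total reads 1024.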
00:05:19.213 11:59:32 -- setup/common.sh@32 -- # continue 00:05:19.213 11:59:32 -- setup/common.sh@31 -- # IFS=': ' 00:05:19.213 11:59:32 -- setup/common.sh@31 -- # read -r var val _ 00:05:19.213 11:59:32 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.213 11:59:32 -- setup/common.sh@32 -- # continue 00:05:19.213 11:59:32 -- setup/common.sh@31 -- # IFS=': ' 00:05:19.213 11:59:32 -- setup/common.sh@31 -- # read -r var val _ 00:05:19.213 11:59:32 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.213 11:59:32 -- setup/common.sh@32 -- # continue 00:05:19.213 11:59:32 -- setup/common.sh@31 -- # IFS=': ' 00:05:19.213 11:59:32 -- setup/common.sh@31 -- # read -r var val _ 00:05:19.213 11:59:32 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.213 11:59:32 -- setup/common.sh@32 -- # continue 00:05:19.213 11:59:32 -- setup/common.sh@31 -- # IFS=': ' 00:05:19.213 11:59:32 -- setup/common.sh@31 -- # read -r var val _ 00:05:19.213 11:59:32 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.213 11:59:32 -- setup/common.sh@32 -- # continue 00:05:19.213 11:59:32 -- setup/common.sh@31 -- # IFS=': ' 00:05:19.213 11:59:32 -- setup/common.sh@31 -- # read -r var val _ 00:05:19.213 11:59:32 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.213 11:59:32 -- setup/common.sh@32 -- # continue 00:05:19.213 11:59:32 -- setup/common.sh@31 -- # IFS=': ' 00:05:19.213 11:59:32 -- setup/common.sh@31 -- # read -r var val _ 00:05:19.213 11:59:32 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.213 11:59:32 -- setup/common.sh@32 -- # continue 00:05:19.213 11:59:32 -- setup/common.sh@31 -- # IFS=': ' 00:05:19.213 11:59:32 -- setup/common.sh@31 -- # read -r var val _ 00:05:19.213 11:59:32 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.213 11:59:32 -- setup/common.sh@32 -- # continue 00:05:19.214 11:59:32 -- setup/common.sh@31 -- # IFS=': ' 00:05:19.214 11:59:32 -- setup/common.sh@31 -- # read -r var val _ 00:05:19.214 11:59:32 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.214 11:59:32 -- setup/common.sh@32 -- # continue 00:05:19.214 11:59:32 -- setup/common.sh@31 -- # IFS=': ' 00:05:19.214 11:59:32 -- setup/common.sh@31 -- # read -r var val _ 00:05:19.214 11:59:32 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.214 11:59:32 -- setup/common.sh@32 -- # continue 00:05:19.214 11:59:32 -- setup/common.sh@31 -- # IFS=': ' 00:05:19.214 11:59:32 -- setup/common.sh@31 -- # read -r var val _ 00:05:19.214 11:59:32 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.214 11:59:32 -- setup/common.sh@32 -- # continue 00:05:19.214 11:59:32 -- setup/common.sh@31 -- # IFS=': ' 00:05:19.214 11:59:32 -- setup/common.sh@31 -- # read -r var val _ 00:05:19.214 11:59:32 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.214 11:59:32 -- setup/common.sh@32 -- # continue 00:05:19.214 11:59:32 -- setup/common.sh@31 -- # IFS=': ' 00:05:19.214 11:59:32 -- setup/common.sh@31 -- # read -r var val _ 00:05:19.214 11:59:32 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:19.214 11:59:32 -- setup/common.sh@32 -- # continue 00:05:19.214 11:59:32 -- setup/common.sh@31 -- # IFS=': ' 00:05:19.214 11:59:32 -- setup/common.sh@31 -- 
# read -r var val _
00:05:19.214 11:59:32 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:05:19.214 11:59:32 -- setup/common.sh@32 -- # continue
[set -x trace condensed: the common.sh@31-32 read/compare/continue loop repeats identically for each remaining /proc/meminfo key (Zswap, Zswapped, Dirty, Writeback, AnonPages, Mapped, Shmem, KReclaimable, Slab, SReclaimable, SUnreclaim, KernelStack, PageTables, SecPageTables, NFS_Unstable, Bounce, WritebackTmp, CommitLimit, Committed_AS, VmallocTotal, VmallocUsed, VmallocChunk, Percpu, HardwareCorrupted, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, CmaTotal, CmaFree, Unaccepted) until the requested key is reached]
00:05:19.215 11:59:32 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:05:19.215 11:59:32 -- setup/common.sh@33 -- # echo 1024
00:05:19.215 11:59:32 -- setup/common.sh@33 -- # return 0
00:05:19.215 11:59:32 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:05:19.215 11:59:32 -- setup/hugepages.sh@112 -- # get_nodes
00:05:19.215 11:59:32 -- setup/hugepages.sh@27 -- # local node
00:05:19.215 11:59:32 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:19.215 11:59:32 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:05:19.215 11:59:32 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:19.215 11:59:32 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:05:19.215 11:59:32 -- setup/hugepages.sh@32 -- # no_nodes=2
00:05:19.215 11:59:32 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
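The loop condensed above is setup/common.sh's get_meminfo: split each meminfo line on ': ', compare the key against the requested field, continue until it matches, then echo the value. A minimal stand-alone sketch of that lookup pattern, assuming only bash and a readable /proc/meminfo (the function name and the streaming read are ours; the real helper mapfiles the file and iterates with the same IFS=': ' split):

# sketch of the get_meminfo key scan traced above (illustrative, not the script's code)
get_meminfo_sketch() {
	local get=$1 var val _
	while IFS=': ' read -r var val _; do
		# meminfo records look like "HugePages_Total:    1024"
		[[ $var == "$get" ]] && echo "$val" && return 0
	done </proc/meminfo
	return 1
}
get_meminfo_sketch HugePages_Total # prints 1024 on this runner

The scan is linear over ~50 keys, which is why the trace above is dominated by "continue" records: each lookup re-walks the file from the top.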
00:05:19.215 11:59:32 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:05:19.215 11:59:32 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:05:19.215 11:59:32 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:05:19.215 11:59:32 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:19.215 11:59:32 -- setup/common.sh@18 -- # local node=0
00:05:19.215 11:59:32 -- setup/common.sh@19 -- # local var val
00:05:19.215 11:59:32 -- setup/common.sh@20 -- # local mem_f mem
00:05:19.215 11:59:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:19.215 11:59:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:05:19.215 11:59:32 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:05:19.215 11:59:32 -- setup/common.sh@28 -- # mapfile -t mem
00:05:19.215 11:59:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:19.215 11:59:32 -- setup/common.sh@31 -- # IFS=': '
00:05:19.215 11:59:32 -- setup/common.sh@31 -- # read -r var val _
00:05:19.215 11:59:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069876 kB' 'MemFree: 28820864 kB' 'MemUsed: 19249012 kB' 'SwapCached: 0 kB' 'Active: 11775284 kB' 'Inactive: 3587012 kB' 'Active(anon): 11513776 kB' 'Inactive(anon): 0 kB' 'Active(file): 261508 kB' 'Inactive(file): 3587012 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 14897248 kB' 'Mapped: 111280 kB' 'AnonPages: 468372 kB' 'Shmem: 11048728 kB' 'KernelStack: 9048 kB' 'PageTables: 5232 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 288080 kB' 'Slab: 514920 kB' 'SReclaimable: 288080 kB' 'SUnreclaim: 226840 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
[set -x trace condensed: the same read/compare/continue loop walks the node0 fields above until HugePages_Surp is reached]
00:05:19.216 11:59:32 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:19.216 11:59:32 -- setup/common.sh@33 -- # echo 0
00:05:19.216 11:59:32 -- setup/common.sh@33 -- # return 0
00:05:19.216 11:59:32 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:05:19.216 11:59:32 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:05:19.216 11:59:32 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:05:19.216 11:59:32 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:05:19.216 11:59:32 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:05:19.216 node0=1024 expecting 1024
00:05:19.216 11:59:32 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:05:19.216 11:59:32 -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no
00:05:19.216 11:59:32 -- setup/hugepages.sh@202 -- # NRHUGE=512
00:05:19.216 11:59:32 -- setup/hugepages.sh@202 -- # setup output
00:05:19.216 11:59:32 -- setup/common.sh@9 -- # [[ output == output ]]
00:05:19.216 11:59:32 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:05:23.412 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:05:23.412 0000:1a:00.0 (8086 0a54): Already using the vfio-pci driver
00:05:23.412 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:05:23.412 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:05:23.412 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:05:23.412 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:05:23.412 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:05:23.412 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:05:23.412 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:05:23.412 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:05:23.412 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:05:23.412 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:05:23.412 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:05:23.412 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:05:23.412 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:05:23.413 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:05:23.413 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:05:25.323 INFO: Requested 512 hugepages but 1024 already allocated on node0
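The node-scoped lookup traced above differs from the global one only in where it reads: given a node argument, common.sh switches mem_f to /sys/devices/system/node/node<N>/meminfo and strips the "Node <N> " prefix those lines carry (common.sh@23-29); get_nodes at hugepages.sh@29 discovers the nodes with the same extglob pattern. A sketch of the per-node variant under those assumptions (function name ours, not the script's):

# sketch of the per-node get_meminfo path (illustrative)
shopt -s extglob
get_node_meminfo_sketch() {
	local get=$1 node=$2 mem_f=/proc/meminfo mem line var val _
	# node-scoped meminfo lives in sysfs; every line is prefixed "Node <N> "
	if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
		mem_f=/sys/devices/system/node/node$node/meminfo
	fi
	mapfile -t mem <"$mem_f"
	mem=("${mem[@]#Node +([0-9]) }") # same prefix strip as common.sh@29
	for line in "${mem[@]}"; do
		IFS=': ' read -r var val _ <<<"$line"
		[[ $var == "$get" ]] && echo "$val" && return 0
	done
	return 1
}
get_node_meminfo_sketch HugePages_Surp 0 # prints 0, matching the trace

The sorted_t[nodes_test[node]]=1 / sorted_s[nodes_sys[node]]=1 assignments at hugepages.sh@127 use associative arrays as sets, so after the loop a single key means every node reported the same count.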
00:05:25.323 11:59:37 -- setup/hugepages.sh@204 -- # verify_nr_hugepages
00:05:25.323 11:59:37 -- setup/hugepages.sh@89 -- # local node
00:05:25.323 11:59:37 -- setup/hugepages.sh@90 -- # local sorted_t
00:05:25.323 11:59:37 -- setup/hugepages.sh@91 -- # local sorted_s
00:05:25.323 11:59:37 -- setup/hugepages.sh@92 -- # local surp
00:05:25.323 11:59:37 -- setup/hugepages.sh@93 -- # local resv
00:05:25.323 11:59:37 -- setup/hugepages.sh@94 -- # local anon
00:05:25.323 11:59:37 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:05:25.323 11:59:37 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:05:25.323 11:59:37 -- setup/common.sh@17 -- # local get=AnonHugePages
00:05:25.323 11:59:37 -- setup/common.sh@18 -- # local node=
00:05:25.323 11:59:37 -- setup/common.sh@19 -- # local var val
00:05:25.323 11:59:37 -- setup/common.sh@20 -- # local mem_f mem
00:05:25.323 11:59:37 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:25.323 11:59:37 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:25.323 11:59:37 -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:25.323 11:59:37 -- setup/common.sh@28 -- # mapfile -t mem
00:05:25.323 11:59:37 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:25.323 11:59:37 -- setup/common.sh@31 -- # IFS=': '
00:05:25.323 11:59:37 -- setup/common.sh@31 -- # read -r var val _
00:05:25.323 11:59:37 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293512 kB' 'MemFree: 69108824 kB' 'MemAvailable: 73012984 kB' 'Buffers: 8112 kB' 'Cached: 17655256 kB' 'SwapCached: 0 kB' 'Active: 14557060 kB' 'Inactive: 3720140 kB' 'Active(anon): 14009840 kB' 'Inactive(anon): 0 kB' 'Active(file): 547220 kB' 'Inactive(file): 3720140 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 617252 kB' 'Mapped: 188964 kB' 'Shmem: 13396008 kB' 'KReclaimable: 472768 kB' 'Slab: 881196 kB' 'SReclaimable: 472768 kB' 'SUnreclaim: 408428 kB' 'KernelStack: 16624 kB' 'PageTables: 9484 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486784 kB' 'Committed_AS: 15475360 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214408 kB' 'VmallocChunk: 0 kB' 'Percpu: 69120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1592732 kB' 'DirectMap2M: 41074688 kB' 'DirectMap1G: 58720256 kB'
[set -x trace condensed: the read/compare/continue loop walks the fields above (MemTotal through HardwareCorrupted) until AnonHugePages matches]
00:05:25.324 11:59:37 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:25.324 11:59:37 -- setup/common.sh@33 -- # echo 0
00:05:25.324 11:59:37 -- setup/common.sh@33 -- # return 0
00:05:25.324 11:59:37 -- setup/hugepages.sh@97 -- # anon=0
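The check at hugepages.sh@96 above reads /sys/kernel/mm/transparent_hugepage/enabled, where the kernel marks the active THP mode with brackets ("always [madvise] never" on this runner); AnonHugePages is sampled as a baseline only when the mode is not [never]. A stand-alone sketch of that probe (the awk lookup is our illustration, not the script's code):

# sketch of the transparent-hugepage probe at hugepages.sh@96-97 (illustrative)
anon=0
thp=$(</sys/kernel/mm/transparent_hugepage/enabled) # e.g. "always [madvise] never"
if [[ $thp != *"[never]"* ]]; then
	# THP is active in some mode: record the current anonymous-hugepage baseline
	anon=$(awk '$1 == "AnonHugePages:" {print $2}' /proc/meminfo)
fi
echo "anon_hugepages=$anon"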
00:05:25.324 11:59:37 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:05:25.324 11:59:38 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:25.324 11:59:38 -- setup/common.sh@18 -- # local node=
00:05:25.324 11:59:38 -- setup/common.sh@19 -- # local var val
00:05:25.324 11:59:38 -- setup/common.sh@20 -- # local mem_f mem
00:05:25.324 11:59:38 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:25.324 11:59:38 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:25.324 11:59:38 -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:25.324 11:59:38 -- setup/common.sh@28 -- # mapfile -t mem
00:05:25.324 11:59:38 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:25.324 11:59:38 -- setup/common.sh@31 -- # IFS=': '
00:05:25.324 11:59:38 -- setup/common.sh@31 -- # read -r var val _
00:05:25.324 11:59:38 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293512 kB' 'MemFree: 69107420 kB' 'MemAvailable: 73011580 kB' 'Buffers: 8112 kB' 'Cached: 17655256 kB' 'SwapCached: 0 kB' 'Active: 14557008 kB' 'Inactive: 3720140 kB' 'Active(anon): 14009788 kB' 'Inactive(anon): 0 kB' 'Active(file): 547220 kB' 'Inactive(file): 3720140 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 617092 kB' 'Mapped: 188888 kB' 'Shmem: 13396008 kB' 'KReclaimable: 472768 kB' 'Slab: 881212 kB' 'SReclaimable: 472768 kB' 'SUnreclaim: 408444 kB' 'KernelStack: 16592 kB' 'PageTables: 9536 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486784 kB' 'Committed_AS: 15474104 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214328 kB' 'VmallocChunk: 0 kB' 'Percpu: 69120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1592732 kB' 'DirectMap2M: 41074688 kB' 'DirectMap1G: 58720256 kB'
[set -x trace condensed: the read/compare/continue loop walks the fields above until HugePages_Surp matches]
00:05:25.326 11:59:38 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:25.326 11:59:38 -- setup/common.sh@33 -- # echo 0
00:05:25.326 11:59:38 -- setup/common.sh@33 -- # return 0
00:05:25.326 11:59:38 -- setup/hugepages.sh@99 -- # surp=0
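With anon and surp established, the script fetches HugePages_Rsvd next and then asserts that the total equals the requested count plus surplus and reserved pages (the hugepages.sh@107-110 tests below). A sketch of that consistency check under the same /proc/meminfo fields; nr_hugepages=1024 comes from this run, and the awk parsing is our shorthand for the script's repeated get_meminfo calls:

# sketch of the hugepage accounting check this pass is building toward (illustrative)
nr_hugepages=1024
read -r total surp resv < <(awk '
	$1 == "HugePages_Total:" {t = $2}
	$1 == "HugePages_Surp:"  {s = $2}
	$1 == "HugePages_Rsvd:"  {r = $2}
	END {print t, s, r}' /proc/meminfo)
if ((total == nr_hugepages + surp + resv)); then
	echo "hugepage accounting consistent: total=$total surp=$surp resv=$resv"
else
	echo "unexpected hugepage counts: total=$total surp=$surp resv=$resv" >&2
fi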
00:05:25.326 11:59:38 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:05:25.326 11:59:38 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:05:25.326 11:59:38 -- setup/common.sh@18 -- # local node=
00:05:25.326 11:59:38 -- setup/common.sh@19 -- # local var val
00:05:25.326 11:59:38 -- setup/common.sh@20 -- # local mem_f mem
00:05:25.326 11:59:38 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:25.326 11:59:38 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:25.326 11:59:38 -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:25.326 11:59:38 -- setup/common.sh@28 -- # mapfile -t mem
00:05:25.326 11:59:38 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:25.326 11:59:38 -- setup/common.sh@31 -- # IFS=': '
00:05:25.326 11:59:38 -- setup/common.sh@31 -- # read -r var val _
00:05:25.326 11:59:38 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293512 kB' 'MemFree: 69108732 kB' 'MemAvailable: 73012892 kB' 'Buffers: 8112 kB' 'Cached: 17655256 kB' 'SwapCached: 0 kB' 'Active: 14556816 kB' 'Inactive: 3720140 kB' 'Active(anon): 14009596 kB' 'Inactive(anon): 0 kB' 'Active(file): 547220 kB' 'Inactive(file): 3720140 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 616884 kB' 'Mapped: 188900 kB' 'Shmem: 13396008 kB' 'KReclaimable: 472768 kB' 'Slab: 881204 kB' 'SReclaimable: 472768 kB' 'SUnreclaim: 408436 kB' 'KernelStack: 16560 kB' 'PageTables: 10048 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486784 kB' 'Committed_AS: 15475624 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214360 kB' 'VmallocChunk: 0 kB' 'Percpu: 69120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1592732 kB' 'DirectMap2M: 41074688 kB' 'DirectMap1G: 58720256 kB'
[set -x trace condensed: the read/compare/continue loop walks the fields above until HugePages_Rsvd matches]
00:05:25.327 11:59:38 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:05:25.327 11:59:38 -- setup/common.sh@33 -- # echo 0
00:05:25.327 11:59:38 -- setup/common.sh@33 -- # return 0
00:05:25.327 11:59:38 -- setup/hugepages.sh@100 -- # resv=0
00:05:25.327 11:59:38 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:05:25.327 nr_hugepages=1024
00:05:25.327 11:59:38 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:05:25.327 resv_hugepages=0
00:05:25.327 11:59:38 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:05:25.327 surplus_hugepages=0
00:05:25.327 11:59:38 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:05:25.327 anon_hugepages=0
00:05:25.327 11:59:38 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:05:25.327 11:59:38 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:05:25.327 11:59:38 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:05:25.327 11:59:38 -- setup/common.sh@17 -- # local get=HugePages_Total
00:05:25.327 11:59:38 -- setup/common.sh@18 -- # local node=
00:05:25.327 11:59:38 -- setup/common.sh@19 -- # local var val
00:05:25.327 11:59:38 -- setup/common.sh@20 -- # local mem_f mem
00:05:25.327 11:59:38 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:25.327 11:59:38 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:25.327 11:59:38 -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:25.327 11:59:38 -- setup/common.sh@28 -- # mapfile -t mem
00:05:25.327 11:59:38 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:25.327 11:59:38 -- setup/common.sh@31 -- # IFS=': '
00:05:25.327 11:59:38 -- setup/common.sh@31 -- # read -r var val _
00:05:25.327 11:59:38 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293512 kB' 'MemFree: 69107796 kB' 'MemAvailable: 73011956 kB' 'Buffers: 8112 kB' 'Cached: 17655280 kB' 'SwapCached: 0 kB' 'Active: 14556928 kB' 'Inactive: 3720140 kB' 'Active(anon): 14009708 kB' 'Inactive(anon): 0 kB' 'Active(file): 547220 kB' 'Inactive(file): 3720140 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 617044 kB' 'Mapped: 188892 kB' 'Shmem: 13396032 kB' 'KReclaimable: 472768 kB' 'Slab: 881044 kB' 'SReclaimable: 472768 kB' 'SUnreclaim: 408276 kB' 'KernelStack: 16720 kB' 'PageTables: 10004 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486784 kB' 'Committed_AS: 15472264 kB' 'VmallocTotal: 
34359738367 kB' 'VmallocUsed: 214344 kB' 'VmallocChunk: 0 kB' 'Percpu: 69120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1592732 kB' 'DirectMap2M: 41074688 kB' 'DirectMap1G: 58720256 kB' 00:05:25.327 11:59:38 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.327 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.327 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.327 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.327 11:59:38 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.327 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.327 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.327 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.327 11:59:38 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.327 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.327 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.327 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.327 11:59:38 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.328 11:59:38 -- 
setup/common.sh@31 -- # read -r var val _ 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.328 11:59:38 -- 
setup/common.sh@32 -- # continue 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.328 11:59:38 -- setup/common.sh@31 -- # read -r var 
val _ 00:05:25.328 11:59:38 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.329 11:59:38 -- 
setup/common.sh@33 -- # echo 1024 00:05:25.329 11:59:38 -- setup/common.sh@33 -- # return 0 00:05:25.329 11:59:38 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:25.329 11:59:38 -- setup/hugepages.sh@112 -- # get_nodes 00:05:25.329 11:59:38 -- setup/hugepages.sh@27 -- # local node 00:05:25.329 11:59:38 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:25.329 11:59:38 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:05:25.329 11:59:38 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:25.329 11:59:38 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:05:25.329 11:59:38 -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:25.329 11:59:38 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:25.329 11:59:38 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:25.329 11:59:38 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:25.329 11:59:38 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:05:25.329 11:59:38 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:25.329 11:59:38 -- setup/common.sh@18 -- # local node=0 00:05:25.329 11:59:38 -- setup/common.sh@19 -- # local var val 00:05:25.329 11:59:38 -- setup/common.sh@20 -- # local mem_f mem 00:05:25.329 11:59:38 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:25.329 11:59:38 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:25.329 11:59:38 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:25.329 11:59:38 -- setup/common.sh@28 -- # mapfile -t mem 00:05:25.329 11:59:38 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.329 11:59:38 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069876 kB' 'MemFree: 28803752 kB' 'MemUsed: 19266124 kB' 'SwapCached: 0 kB' 'Active: 11775220 kB' 'Inactive: 3587012 kB' 'Active(anon): 11513712 kB' 'Inactive(anon): 0 kB' 'Active(file): 261508 kB' 'Inactive(file): 3587012 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 14897368 kB' 'Mapped: 111280 kB' 'AnonPages: 467996 kB' 'Shmem: 11048848 kB' 'KernelStack: 9016 kB' 'PageTables: 5136 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 288080 kB' 'Slab: 514408 kB' 'SReclaimable: 288080 kB' 'SUnreclaim: 226328 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # 
read -r var val _ 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.329 
11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.329 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.329 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.330 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.330 11:59:38 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.330 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.330 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.330 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.330 11:59:38 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.330 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.330 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.330 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.330 11:59:38 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.330 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.330 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.330 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.330 11:59:38 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.330 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.330 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.330 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.330 11:59:38 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.330 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.330 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.330 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.330 11:59:38 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.330 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.330 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.330 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.330 11:59:38 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.330 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.330 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.330 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.330 11:59:38 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.330 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.330 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.330 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.330 11:59:38 -- setup/common.sh@32 -- # [[ AnonHugePages == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.330 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.330 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.330 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.330 11:59:38 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.330 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.330 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.330 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.330 11:59:38 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.330 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.330 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.330 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.330 11:59:38 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.330 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.330 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.330 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.330 11:59:38 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.330 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.330 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.330 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.330 11:59:38 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.330 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.330 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.330 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.330 11:59:38 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.330 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.330 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.330 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.330 11:59:38 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.330 11:59:38 -- setup/common.sh@32 -- # continue 00:05:25.330 11:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:05:25.330 11:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:05:25.330 11:59:38 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.330 11:59:38 -- setup/common.sh@33 -- # echo 0 00:05:25.330 11:59:38 -- setup/common.sh@33 -- # return 0 00:05:25.330 11:59:38 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:25.330 11:59:38 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:25.330 11:59:38 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:25.330 11:59:38 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:25.330 11:59:38 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:05:25.330 node0=1024 expecting 1024 00:05:25.330 11:59:38 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:05:25.330 00:05:25.330 real 0m12.199s 00:05:25.330 user 0m4.335s 00:05:25.330 sys 0m7.889s 00:05:25.330 11:59:38 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:25.330 11:59:38 -- common/autotest_common.sh@10 -- # set +x 00:05:25.330 ************************************ 00:05:25.330 END TEST no_shrink_alloc 00:05:25.330 ************************************ 00:05:25.330 11:59:38 -- setup/hugepages.sh@217 -- # clear_hp 00:05:25.330 11:59:38 -- setup/hugepages.sh@37 -- # local node hp 00:05:25.330 11:59:38 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:05:25.330 
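[Editor's note: the condensed xtrace above is SPDK's get_meminfo helper in setup/common.sh, which mapfiles /proc/meminfo (or a per-node meminfo file), strips any "Node N " prefix, and scans key by key until the requested field is found. A minimal standalone sketch of the same idea follows; get_field is an illustrative name, not the SPDK function, and the sample values are the ones seen in this run.]

  #!/usr/bin/env bash
  shopt -s extglob

  # get_field FIELD [NODE]: print FIELD's numeric value from /proc/meminfo,
  # or from /sys/devices/system/node/nodeNODE/meminfo when NODE is given.
  get_field() {
      local get=$1 node=$2
      local mem_f=/proc/meminfo
      [[ -n $node ]] && mem_f=/sys/devices/system/node/node$node/meminfo
      local line key val
      while IFS= read -r line; do
          line=${line#Node +([0-9]) }   # per-node lines carry a "Node N " prefix
          key=${line%%:*}
          if [[ $key == "$get" ]]; then
              val=${line#*:}
              val=${val%%kB*}           # drop the unit when present
              echo "${val//[[:space:]]/}"
              return 0
          fi
      done < "$mem_f"
      return 1
  }

  get_field HugePages_Total      # 1024 in the run above
  get_field HugePages_Surp 0     # 0 on node 0 in the run above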
11:59:38 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:25.330 11:59:38 -- setup/hugepages.sh@41 -- # echo 0 00:05:25.330 11:59:38 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:25.330 11:59:38 -- setup/hugepages.sh@41 -- # echo 0 00:05:25.330 11:59:38 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:05:25.330 11:59:38 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:25.330 11:59:38 -- setup/hugepages.sh@41 -- # echo 0 00:05:25.330 11:59:38 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:25.330 11:59:38 -- setup/hugepages.sh@41 -- # echo 0 00:05:25.330 11:59:38 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:05:25.330 11:59:38 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:05:25.330 00:05:25.330 real 0m46.055s 00:05:25.330 user 0m14.924s 00:05:25.330 sys 0m28.224s 00:05:25.330 11:59:38 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:25.330 11:59:38 -- common/autotest_common.sh@10 -- # set +x 00:05:25.330 ************************************ 00:05:25.330 END TEST hugepages 00:05:25.330 ************************************ 00:05:25.330 11:59:38 -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:05:25.330 11:59:38 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:25.330 11:59:38 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:25.330 11:59:38 -- common/autotest_common.sh@10 -- # set +x 00:05:25.330 ************************************ 00:05:25.330 START TEST driver 00:05:25.330 ************************************ 00:05:25.330 11:59:38 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:05:25.330 * Looking for test storage... 
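[Editor's note: the clear_hp teardown traced above resets every hugepage pool before the next suite runs. A sketch of the equivalent standalone loop, assuming the standard sysfs layout; it must run as root and discards any reserved hugepages.]

  #!/usr/bin/env bash
  shopt -s nullglob
  # Zero the hugepage count for every page size on every NUMA node.
  for node in /sys/devices/system/node/node[0-9]*; do
      for hp in "$node"/hugepages/hugepages-*; do
          echo 0 > "$hp/nr_hugepages"
      done
  done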
00:05:25.330 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:05:25.330 11:59:38 -- setup/driver.sh@68 -- # setup reset 00:05:25.330 11:59:38 -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:25.330 11:59:38 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:33.455 11:59:45 -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:05:33.455 11:59:45 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:33.455 11:59:45 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:33.455 11:59:45 -- common/autotest_common.sh@10 -- # set +x 00:05:33.455 ************************************ 00:05:33.455 START TEST guess_driver 00:05:33.455 ************************************ 00:05:33.455 11:59:45 -- common/autotest_common.sh@1104 -- # guess_driver 00:05:33.455 11:59:45 -- setup/driver.sh@46 -- # local driver setup_driver marker 00:05:33.455 11:59:45 -- setup/driver.sh@47 -- # local fail=0 00:05:33.455 11:59:45 -- setup/driver.sh@49 -- # pick_driver 00:05:33.455 11:59:45 -- setup/driver.sh@36 -- # vfio 00:05:33.455 11:59:45 -- setup/driver.sh@21 -- # local iommu_grups 00:05:33.455 11:59:45 -- setup/driver.sh@22 -- # local unsafe_vfio 00:05:33.455 11:59:45 -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:05:33.455 11:59:45 -- setup/driver.sh@25 -- # unsafe_vfio=N 00:05:33.455 11:59:45 -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:05:33.455 11:59:45 -- setup/driver.sh@29 -- # (( 238 > 0 )) 00:05:33.455 11:59:45 -- setup/driver.sh@30 -- # is_driver vfio_pci 00:05:33.455 11:59:45 -- setup/driver.sh@14 -- # mod vfio_pci 00:05:33.455 11:59:45 -- setup/driver.sh@12 -- # dep vfio_pci 00:05:33.455 11:59:45 -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:05:33.455 11:59:45 -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:05:33.455 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:05:33.455 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:05:33.455 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:05:33.455 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:05:33.455 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:05:33.455 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:05:33.455 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:05:33.455 11:59:45 -- setup/driver.sh@30 -- # return 0 00:05:33.455 11:59:45 -- setup/driver.sh@37 -- # echo vfio-pci 00:05:33.455 11:59:45 -- setup/driver.sh@49 -- # driver=vfio-pci 00:05:33.455 11:59:45 -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:05:33.455 11:59:45 -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:05:33.455 Looking for driver=vfio-pci 00:05:33.455 11:59:45 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:33.455 11:59:45 -- setup/driver.sh@45 -- # setup output config 00:05:33.455 11:59:45 -- setup/common.sh@9 -- # [[ output == output ]] 00:05:33.455 11:59:45 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:36.750 11:59:49 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:36.750 11:59:49 -- setup/driver.sh@61 -- # [[ 
vfio-pci == vfio-pci ]] 00:05:36.750 11:59:49 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver
[xtrace condensed: the "-> vfio-pci" marker/driver pairs emitted by setup.sh config are matched and consumed repeatedly (setup/driver.sh@58, @61, @57), every pair agreeing on vfio-pci]
00:05:42.212 11:59:54 -- setup/driver.sh@64 -- # (( fail == 0 )) 00:05:42.212 11:59:54 -- setup/driver.sh@65 -- # setup reset 00:05:42.212 11:59:54 -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:42.212 11:59:54 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:48.780 00:05:48.780 real 0m15.950s 00:05:48.780 user 0m3.779s 00:05:48.780 sys 0m8.199s 00:05:48.780 12:00:01 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:48.780 12:00:01 -- common/autotest_common.sh@10 -- # set +x 00:05:48.780 ************************************ 00:05:48.780 END TEST guess_driver 00:05:48.780 ************************************ 00:05:48.780 00:05:48.780 real 0m23.345s 00:05:48.780 user 0m5.978s 00:05:48.780 sys 0m12.516s 00:05:48.780 12:00:01 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:48.780 12:00:01 -- common/autotest_common.sh@10 -- # set +x 00:05:48.780 ************************************ 00:05:48.780 END TEST driver 00:05:48.780 ************************************ 00:05:48.780 12:00:01 -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:05:48.780 12:00:01 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:48.780 12:00:01 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:48.780 12:00:01 -- common/autotest_common.sh@10 -- # set +x 00:05:48.780 ************************************ 00:05:48.780 START TEST devices 00:05:48.780 ************************************ 00:05:48.780 12:00:01 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh * Looking for test storage... 
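[Editor's note: the guess_driver test that just finished picks vfio-pci when IOMMU groups are populated and the module resolves to real kernel objects; the "No valid driver found" branch was not exercised in this run. A condensed sketch of that decision; the compgen glob here stands in for the iommu_groups array count that driver.sh actually uses.]

  #!/usr/bin/env bash
  # Prefer vfio-pci if the IOMMU is active and the module (plus its
  # dependencies) resolves to loadable .ko files.
  if compgen -G '/sys/kernel/iommu_groups/*' > /dev/null &&
     modprobe --show-depends vfio_pci 2>/dev/null | grep -q '\.ko'; then
      echo "Looking for driver=vfio-pci"
  else
      echo "No valid driver found"
  fi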
00:05:48.780 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:05:48.780 12:00:01 -- setup/devices.sh@190 -- # trap cleanup EXIT 00:05:48.780 12:00:01 -- setup/devices.sh@192 -- # setup reset 00:05:48.780 12:00:01 -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:48.780 12:00:01 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:55.348 12:00:08 -- setup/devices.sh@194 -- # get_zoned_devs 00:05:55.348 12:00:08 -- common/autotest_common.sh@1654 -- # zoned_devs=() 00:05:55.348 12:00:08 -- common/autotest_common.sh@1654 -- # local -gA zoned_devs 00:05:55.348 12:00:08 -- common/autotest_common.sh@1655 -- # local nvme bdf 00:05:55.348 12:00:08 -- common/autotest_common.sh@1657 -- # for nvme in /sys/block/nvme* 00:05:55.348 12:00:08 -- common/autotest_common.sh@1658 -- # is_block_zoned nvme0n1 00:05:55.348 12:00:08 -- common/autotest_common.sh@1647 -- # local device=nvme0n1 00:05:55.348 12:00:08 -- common/autotest_common.sh@1649 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:05:55.348 12:00:08 -- common/autotest_common.sh@1650 -- # [[ none != none ]] 00:05:55.348 12:00:08 -- setup/devices.sh@196 -- # blocks=() 00:05:55.348 12:00:08 -- setup/devices.sh@196 -- # declare -a blocks 00:05:55.348 12:00:08 -- setup/devices.sh@197 -- # blocks_to_pci=() 00:05:55.348 12:00:08 -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:05:55.348 12:00:08 -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:05:55.348 12:00:08 -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:05:55.348 12:00:08 -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:05:55.348 12:00:08 -- setup/devices.sh@201 -- # ctrl=nvme0 00:05:55.348 12:00:08 -- setup/devices.sh@202 -- # pci=0000:1a:00.0 00:05:55.348 12:00:08 -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\1\a\:\0\0\.\0* ]] 00:05:55.348 12:00:08 -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:05:55.348 12:00:08 -- scripts/common.sh@380 -- # local block=nvme0n1 pt 00:05:55.348 12:00:08 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:05:55.348 No valid GPT data, bailing 00:05:55.348 12:00:08 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:05:55.348 12:00:08 -- scripts/common.sh@393 -- # pt= 00:05:55.348 12:00:08 -- scripts/common.sh@394 -- # return 1 00:05:55.348 12:00:08 -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:05:55.348 12:00:08 -- setup/common.sh@76 -- # local dev=nvme0n1 00:05:55.348 12:00:08 -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:05:55.348 12:00:08 -- setup/common.sh@80 -- # echo 4000787030016 00:05:55.348 12:00:08 -- setup/devices.sh@204 -- # (( 4000787030016 >= min_disk_size )) 00:05:55.348 12:00:08 -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:05:55.348 12:00:08 -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:1a:00.0 00:05:55.348 12:00:08 -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:05:55.348 12:00:08 -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:05:55.348 12:00:08 -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:05:55.348 12:00:08 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:55.348 12:00:08 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:55.348 12:00:08 -- common/autotest_common.sh@10 -- # set +x 00:05:55.348 ************************************ 00:05:55.348 START TEST nvme_mount 00:05:55.348 ************************************ 00:05:55.348 
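[Editor's note: the device screening traced above accepts a disk only if it is not zoned, carries no partition-table signature (so "No valid GPT data, bailing" is the good path), and is at least min_disk_size = 3221225472 bytes (3 GiB). A standalone sketch of the same gate, as an assumption-level illustration rather than the devices.sh code itself: /sys/block reports size in 512-byte sectors, and blkid may need root.]

  #!/usr/bin/env bash
  min_disk_size=$((3 * 1024 * 1024 * 1024))   # 3221225472 bytes
  dev=nvme0n1                                  # the disk selected in this run
  bytes=$(( $(cat "/sys/block/$dev/size") * 512 ))
  if [[ -z $(blkid -s PTTYPE -o value "/dev/$dev") ]] && (( bytes >= min_disk_size )); then
      echo "/dev/$dev is usable: $bytes bytes"  # 4000787030016 here
  fi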
12:00:08 -- common/autotest_common.sh@1104 -- # nvme_mount 00:05:55.348 12:00:08 -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:05:55.348 12:00:08 -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:05:55.348 12:00:08 -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:55.348 12:00:08 -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:55.348 12:00:08 -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:05:55.348 12:00:08 -- setup/common.sh@39 -- # local disk=nvme0n1 00:05:55.348 12:00:08 -- setup/common.sh@40 -- # local part_no=1 00:05:55.348 12:00:08 -- setup/common.sh@41 -- # local size=1073741824 00:05:55.348 12:00:08 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:55.348 12:00:08 -- setup/common.sh@44 -- # parts=() 00:05:55.348 12:00:08 -- setup/common.sh@44 -- # local parts 00:05:55.348 12:00:08 -- setup/common.sh@46 -- # (( part = 1 )) 00:05:55.348 12:00:08 -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:55.348 12:00:08 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:55.348 12:00:08 -- setup/common.sh@46 -- # (( part++ )) 00:05:55.348 12:00:08 -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:55.348 12:00:08 -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:55.348 12:00:08 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:55.348 12:00:08 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:05:56.331 Creating new GPT entries in memory. 00:05:56.331 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:56.331 other utilities. 00:05:56.331 12:00:09 -- setup/common.sh@57 -- # (( part = 1 )) 00:05:56.331 12:00:09 -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:56.331 12:00:09 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:56.331 12:00:09 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:56.331 12:00:09 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:57.711 Creating new GPT entries in memory. 00:05:57.711 The operation has completed successfully. 
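[Editor's note: the sgdisk sequence that just completed wipes the label and creates one 1 GiB data partition: 1073741824 / 512 = 2097152 sectors, so the partition spans sectors 2048 through 2048 + 2097152 - 1 = 2099199, matching --new=1:2048:2099199 above. A standalone sketch, with udevadm settle standing in for SPDK's sync_dev_uevents.sh; destructive, scratch disks only.]

  #!/usr/bin/env bash
  disk=/dev/nvme0n1
  sgdisk "$disk" --zap-all                 # destroy existing GPT/MBR structures
  sgdisk "$disk" --new=1:2048:2099199      # partition 1: sectors 2048..2099199 (1 GiB)
  udevadm settle                           # wait for the kernel to publish the new node
  [[ -b ${disk}p1 ]] && echo "partition ready: ${disk}p1"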
00:05:57.711 12:00:10 -- setup/common.sh@57 -- # (( part++ )) 00:05:57.711 12:00:10 -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:57.711 12:00:10 -- setup/common.sh@62 -- # wait 2653240 00:05:57.711 12:00:10 -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:57.711 12:00:10 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size= 00:05:57.711 12:00:10 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:57.711 12:00:10 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:05:57.711 12:00:10 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:05:57.711 12:00:10 -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:57.711 12:00:10 -- setup/devices.sh@105 -- # verify 0000:1a:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:57.711 12:00:10 -- setup/devices.sh@48 -- # local dev=0000:1a:00.0 00:05:57.711 12:00:10 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:05:57.711 12:00:10 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:57.711 12:00:10 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:57.711 12:00:10 -- setup/devices.sh@53 -- # local found=0 00:05:57.711 12:00:10 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:57.711 12:00:10 -- setup/devices.sh@56 -- # : 00:05:57.711 12:00:10 -- setup/devices.sh@59 -- # local pci status 00:05:57.711 12:00:10 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:57.711 12:00:10 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:1a:00.0 00:05:57.711 12:00:10 -- setup/devices.sh@47 -- # setup output config 00:05:57.711 12:00:10 -- setup/common.sh@9 -- # [[ output == output ]] 00:05:57.711 12:00:10 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:06:01.902 12:00:14 -- setup/devices.sh@62 -- # [[ 0000:1a:00.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:01.902 12:00:14 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:06:01.902 12:00:14 -- setup/devices.sh@63 -- # found=1 00:06:01.902 12:00:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.902 12:00:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:01.902 12:00:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.902 12:00:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:01.902 12:00:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.902 12:00:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:01.902 12:00:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.902 12:00:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:01.902 12:00:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.902 12:00:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 
]] 00:06:01.902 12:00:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.902 12:00:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:01.902 12:00:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.902 12:00:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:01.902 12:00:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.902 12:00:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:01.902 12:00:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.902 12:00:14 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:01.902 12:00:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.902 12:00:14 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:01.902 12:00:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.902 12:00:14 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:01.902 12:00:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.902 12:00:14 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:01.902 12:00:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.902 12:00:14 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:01.902 12:00:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.902 12:00:14 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:01.902 12:00:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.902 12:00:14 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:01.902 12:00:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.902 12:00:14 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:01.902 12:00:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:03.282 12:00:16 -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:03.282 12:00:16 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:06:03.282 12:00:16 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:03.283 12:00:16 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:06:03.283 12:00:16 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:03.283 12:00:16 -- setup/devices.sh@110 -- # cleanup_nvme 00:06:03.283 12:00:16 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:03.283 12:00:16 -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:03.283 12:00:16 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:03.283 12:00:16 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:06:03.283 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:06:03.283 12:00:16 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:06:03.283 12:00:16 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:06:03.542 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:06:03.542 /dev/nvme0n1: 8 bytes were erased at offset 0x3a3817d5e00 (gpt): 45 46 49 20 50 41 52 54 00:06:03.542 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe 
(PMBR): 55 aa 00:06:03.542 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:06:03.542 12:00:16 -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:06:03.542 12:00:16 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:06:03.542 12:00:16 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:03.804 12:00:16 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:06:03.804 12:00:16 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:06:03.804 12:00:16 -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:03.804 12:00:16 -- setup/devices.sh@116 -- # verify 0000:1a:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:03.804 12:00:16 -- setup/devices.sh@48 -- # local dev=0000:1a:00.0 00:06:03.804 12:00:16 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:06:03.804 12:00:16 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:03.804 12:00:16 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:03.804 12:00:16 -- setup/devices.sh@53 -- # local found=0 00:06:03.804 12:00:16 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:06:03.804 12:00:16 -- setup/devices.sh@56 -- # : 00:06:03.804 12:00:16 -- setup/devices.sh@59 -- # local pci status 00:06:03.804 12:00:16 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:03.804 12:00:16 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:1a:00.0 00:06:03.804 12:00:16 -- setup/devices.sh@47 -- # setup output config 00:06:03.804 12:00:16 -- setup/common.sh@9 -- # [[ output == output ]] 00:06:03.804 12:00:16 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:06:08.112 12:00:20 -- setup/devices.sh@62 -- # [[ 0000:1a:00.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:08.112 12:00:20 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:06:08.112 12:00:20 -- setup/devices.sh@63 -- # found=1 00:06:08.112 12:00:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:08.112 12:00:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:08.112 12:00:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:08.112 12:00:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:08.112 12:00:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:08.112 12:00:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:08.112 12:00:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:08.112 12:00:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:08.112 12:00:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:08.112 12:00:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:08.112 12:00:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:08.112 12:00:20 -- 
setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:08.112 12:00:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:08.112 12:00:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:08.112 12:00:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:08.112 12:00:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:08.112 12:00:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:08.112 12:00:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:08.112 12:00:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:08.112 12:00:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:08.112 12:00:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:08.112 12:00:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:08.112 12:00:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:08.113 12:00:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:08.113 12:00:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:08.113 12:00:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:08.113 12:00:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:08.113 12:00:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:08.113 12:00:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:08.113 12:00:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:08.113 12:00:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:08.113 12:00:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:08.113 12:00:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.492 12:00:22 -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:09.492 12:00:22 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:06:09.492 12:00:22 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:09.492 12:00:22 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:06:09.492 12:00:22 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:09.492 12:00:22 -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:09.492 12:00:22 -- setup/devices.sh@125 -- # verify 0000:1a:00.0 data@nvme0n1 '' '' 00:06:09.492 12:00:22 -- setup/devices.sh@48 -- # local dev=0000:1a:00.0 00:06:09.492 12:00:22 -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:06:09.492 12:00:22 -- setup/devices.sh@50 -- # local mount_point= 00:06:09.492 12:00:22 -- setup/devices.sh@51 -- # local test_file= 00:06:09.492 12:00:22 -- setup/devices.sh@53 -- # local found=0 00:06:09.492 12:00:22 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:06:09.753 12:00:22 -- setup/devices.sh@59 -- # local pci status 00:06:09.753 12:00:22 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:09.753 12:00:22 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:1a:00.0 00:06:09.753 12:00:22 -- setup/devices.sh@47 -- # setup output config 00:06:09.753 12:00:22 -- setup/common.sh@9 -- # [[ output == output ]] 00:06:09.753 12:00:22 -- setup/common.sh@10 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:06:13.041 12:00:25 -- setup/devices.sh@62 -- # [[ 0000:1a:00.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:13.041 12:00:25 -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:06:13.041 12:00:25 -- setup/devices.sh@63 -- # found=1 00:06:13.041 12:00:25 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.041 12:00:25 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:13.041 12:00:25 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.041 12:00:25 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:13.041 12:00:25 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.041 12:00:25 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:13.041 12:00:25 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.041 12:00:25 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:13.042 12:00:25 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.042 12:00:25 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:13.042 12:00:25 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.042 12:00:25 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:13.042 12:00:25 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.042 12:00:25 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:13.042 12:00:25 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.042 12:00:25 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:13.042 12:00:25 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.042 12:00:25 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:13.042 12:00:25 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.042 12:00:25 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:13.042 12:00:25 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.042 12:00:25 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:13.042 12:00:25 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.042 12:00:25 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:13.042 12:00:25 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.042 12:00:25 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:13.042 12:00:25 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.042 12:00:25 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:13.042 12:00:25 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.042 12:00:25 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:13.042 12:00:25 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.042 12:00:25 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:13.042 12:00:25 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:15.573 12:00:27 -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:15.573 12:00:27 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:06:15.573 12:00:27 -- setup/devices.sh@68 -- # return 0 00:06:15.573 12:00:27 -- setup/devices.sh@128 -- # cleanup_nvme 00:06:15.573 12:00:27 -- setup/devices.sh@20 -- # mountpoint -q 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:15.573 12:00:27 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:15.574 12:00:27 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:06:15.574 12:00:27 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:06:15.574 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:06:15.574 00:06:15.574 real 0m19.709s 00:06:15.574 user 0m5.749s 00:06:15.574 sys 0m11.644s 00:06:15.574 12:00:28 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:15.574 12:00:28 -- common/autotest_common.sh@10 -- # set +x 00:06:15.574 ************************************ 00:06:15.574 END TEST nvme_mount 00:06:15.574 ************************************ 00:06:15.574 12:00:28 -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:06:15.574 12:00:28 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:15.574 12:00:28 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:15.574 12:00:28 -- common/autotest_common.sh@10 -- # set +x 00:06:15.574 ************************************ 00:06:15.574 START TEST dm_mount 00:06:15.574 ************************************ 00:06:15.574 12:00:28 -- common/autotest_common.sh@1104 -- # dm_mount 00:06:15.574 12:00:28 -- setup/devices.sh@144 -- # pv=nvme0n1 00:06:15.574 12:00:28 -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:06:15.574 12:00:28 -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:06:15.574 12:00:28 -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:06:15.574 12:00:28 -- setup/common.sh@39 -- # local disk=nvme0n1 00:06:15.574 12:00:28 -- setup/common.sh@40 -- # local part_no=2 00:06:15.574 12:00:28 -- setup/common.sh@41 -- # local size=1073741824 00:06:15.574 12:00:28 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:06:15.574 12:00:28 -- setup/common.sh@44 -- # parts=() 00:06:15.574 12:00:28 -- setup/common.sh@44 -- # local parts 00:06:15.574 12:00:28 -- setup/common.sh@46 -- # (( part = 1 )) 00:06:15.574 12:00:28 -- setup/common.sh@46 -- # (( part <= part_no )) 00:06:15.574 12:00:28 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:06:15.574 12:00:28 -- setup/common.sh@46 -- # (( part++ )) 00:06:15.574 12:00:28 -- setup/common.sh@46 -- # (( part <= part_no )) 00:06:15.574 12:00:28 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:06:15.574 12:00:28 -- setup/common.sh@46 -- # (( part++ )) 00:06:15.574 12:00:28 -- setup/common.sh@46 -- # (( part <= part_no )) 00:06:15.574 12:00:28 -- setup/common.sh@51 -- # (( size /= 512 )) 00:06:15.574 12:00:28 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:06:15.574 12:00:28 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:06:16.141 Creating new GPT entries in memory. 00:06:16.141 GPT data structures destroyed! You may now partition the disk using fdisk or 00:06:16.141 other utilities. 00:06:16.141 12:00:29 -- setup/common.sh@57 -- # (( part = 1 )) 00:06:16.141 12:00:29 -- setup/common.sh@57 -- # (( part <= part_no )) 00:06:16.141 12:00:29 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:06:16.141 12:00:29 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:06:16.141 12:00:29 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:06:17.079 Creating new GPT entries in memory. 00:06:17.079 The operation has completed successfully. 
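The sgdisk bounds in this trace follow directly from the sector arithmetic above: 1 GiB is divided down to 512-byte sectors (2097152), the first partition starts at sector 2048, and each later one starts right after the previous end. A minimal standalone sketch of that arithmetic (not part of the test output; the echo keeps it a dry run):

disk=/dev/nvme0n1          # illustrative device path
part_no=2
size=1073741824            # 1 GiB per partition, in bytes
(( size /= 512 ))          # bytes -> 512-byte sectors: 2097152
part_start=0 part_end=0
for (( part = 1; part <= part_no; part++ )); do
    (( part_start = part_start == 0 ? 2048 : part_end + 1 ))
    (( part_end = part_start + size - 1 ))
    # flock serializes concurrent partitioners on the same disk, as in the log
    echo flock "$disk" sgdisk "$disk" --new="$part:$part_start:$part_end"
done

This prints --new=1:2048:2099199 and --new=2:2099200:4196351, matching the first flocked sgdisk call above and the second one that follows below.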
00:06:17.079 12:00:30 -- setup/common.sh@57 -- # (( part++ )) 00:06:17.079 12:00:30 -- setup/common.sh@57 -- # (( part <= part_no )) 00:06:17.079 12:00:30 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:06:17.079 12:00:30 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:06:17.079 12:00:30 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:06:18.459 The operation has completed successfully. 00:06:18.459 12:00:31 -- setup/common.sh@57 -- # (( part++ )) 00:06:18.459 12:00:31 -- setup/common.sh@57 -- # (( part <= part_no )) 00:06:18.459 12:00:31 -- setup/common.sh@62 -- # wait 2658545 00:06:18.459 12:00:31 -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:06:18.459 12:00:31 -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:18.459 12:00:31 -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:06:18.459 12:00:31 -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:06:18.459 12:00:31 -- setup/devices.sh@160 -- # for t in {1..5} 00:06:18.459 12:00:31 -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:06:18.459 12:00:31 -- setup/devices.sh@161 -- # break 00:06:18.459 12:00:31 -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:06:18.459 12:00:31 -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:06:18.459 12:00:31 -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:06:18.459 12:00:31 -- setup/devices.sh@166 -- # dm=dm-0 00:06:18.459 12:00:31 -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:06:18.459 12:00:31 -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:06:18.459 12:00:31 -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:18.459 12:00:31 -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount size= 00:06:18.459 12:00:31 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:18.459 12:00:31 -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:06:18.459 12:00:31 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:06:18.459 12:00:31 -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:18.459 12:00:31 -- setup/devices.sh@174 -- # verify 0000:1a:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:06:18.459 12:00:31 -- setup/devices.sh@48 -- # local dev=0000:1a:00.0 00:06:18.459 12:00:31 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:06:18.459 12:00:31 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:18.459 12:00:31 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:06:18.459 12:00:31 -- setup/devices.sh@53 -- # local found=0 00:06:18.459 12:00:31 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:06:18.459 12:00:31 -- setup/devices.sh@56 -- # : 
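The verify step here first resolves the mapper name to its dm-N node and then checks the holders links under both backing partitions. A condensed sketch of that resolution (names are illustrative; this is not the script itself):

name=nvme_dm_test                        # dm name used by the test
dm=$(readlink -f "/dev/mapper/$name")    # e.g. /dev/dm-0
dm=${dm##*/}                             # -> dm-0
for part in nvme0n1p1 nvme0n1p2; do
    if [[ -e /sys/class/block/$part/holders/$dm ]]; then
        echo "$part is held by $name ($dm)"
    fi
done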
00:06:18.459 12:00:31 -- setup/devices.sh@59 -- # local pci status 00:06:18.459 12:00:31 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:18.459 12:00:31 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:1a:00.0 00:06:18.459 12:00:31 -- setup/devices.sh@47 -- # setup output config 00:06:18.459 12:00:31 -- setup/common.sh@9 -- # [[ output == output ]] 00:06:18.459 12:00:31 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:06:22.652 12:00:34 -- setup/devices.sh@62 -- # [[ 0000:1a:00.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:22.652 12:00:34 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:06:22.652 12:00:34 -- setup/devices.sh@63 -- # found=1 00:06:22.652 12:00:34 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:22.652 12:00:34 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:22.652 12:00:34 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:22.652 12:00:34 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:22.652 12:00:34 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:22.652 12:00:34 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:22.652 12:00:34 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:22.652 12:00:34 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:22.652 12:00:34 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:22.652 12:00:34 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:22.652 12:00:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:22.652 12:00:35 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:22.652 12:00:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:22.652 12:00:35 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:22.652 12:00:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:22.652 12:00:35 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:22.652 12:00:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:22.652 12:00:35 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:22.652 12:00:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:22.652 12:00:35 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:22.652 12:00:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:22.652 12:00:35 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:22.652 12:00:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:22.652 12:00:35 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:22.652 12:00:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:22.652 12:00:35 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:22.652 12:00:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:22.652 12:00:35 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:22.652 12:00:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:22.652 12:00:35 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:22.652 12:00:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:22.652 12:00:35 -- setup/devices.sh@62 -- # 
[[ 0000:80:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:22.652 12:00:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:24.557 12:00:37 -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:24.557 12:00:37 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount ]] 00:06:24.557 12:00:37 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:24.557 12:00:37 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:06:24.557 12:00:37 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:06:24.557 12:00:37 -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:24.557 12:00:37 -- setup/devices.sh@184 -- # verify 0000:1a:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:06:24.557 12:00:37 -- setup/devices.sh@48 -- # local dev=0000:1a:00.0 00:06:24.557 12:00:37 -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:06:24.557 12:00:37 -- setup/devices.sh@50 -- # local mount_point= 00:06:24.557 12:00:37 -- setup/devices.sh@51 -- # local test_file= 00:06:24.557 12:00:37 -- setup/devices.sh@53 -- # local found=0 00:06:24.557 12:00:37 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:06:24.557 12:00:37 -- setup/devices.sh@59 -- # local pci status 00:06:24.557 12:00:37 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:24.557 12:00:37 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:1a:00.0 00:06:24.557 12:00:37 -- setup/devices.sh@47 -- # setup output config 00:06:24.557 12:00:37 -- setup/common.sh@9 -- # [[ output == output ]] 00:06:24.557 12:00:37 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:06:27.851 12:00:40 -- setup/devices.sh@62 -- # [[ 0000:1a:00.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:27.851 12:00:40 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:06:27.851 12:00:40 -- setup/devices.sh@63 -- # found=1 00:06:27.851 12:00:40 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.851 12:00:40 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:27.851 12:00:40 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.851 12:00:40 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:27.851 12:00:40 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.851 12:00:40 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:27.851 12:00:40 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.851 12:00:40 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:27.851 12:00:40 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.851 12:00:40 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:27.851 12:00:40 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.851 12:00:40 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:27.851 12:00:40 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.851 12:00:40 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:27.851 12:00:40 -- setup/devices.sh@60 
-- # read -r pci _ _ status 00:06:27.851 12:00:40 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:27.851 12:00:40 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.851 12:00:40 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:27.851 12:00:40 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.851 12:00:40 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:27.851 12:00:40 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.851 12:00:40 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:27.851 12:00:40 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.851 12:00:40 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:27.851 12:00:40 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.851 12:00:40 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:27.851 12:00:40 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.851 12:00:40 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:27.851 12:00:40 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.851 12:00:40 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:27.851 12:00:40 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:27.851 12:00:40 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:06:27.851 12:00:40 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:29.756 12:00:42 -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:29.757 12:00:42 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:06:29.757 12:00:42 -- setup/devices.sh@68 -- # return 0 00:06:29.757 12:00:42 -- setup/devices.sh@187 -- # cleanup_dm 00:06:29.757 12:00:42 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:29.757 12:00:42 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:06:29.757 12:00:42 -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:06:29.757 12:00:42 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:29.757 12:00:42 -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:06:29.757 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:06:29.757 12:00:42 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:06:29.757 12:00:42 -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:06:29.757 00:06:29.757 real 0m14.711s 00:06:29.757 user 0m3.863s 00:06:29.757 sys 0m7.857s 00:06:29.757 12:00:42 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:29.757 12:00:42 -- common/autotest_common.sh@10 -- # set +x 00:06:29.757 ************************************ 00:06:29.757 END TEST dm_mount 00:06:29.757 ************************************ 00:06:30.016 12:00:42 -- setup/devices.sh@1 -- # cleanup 00:06:30.016 12:00:42 -- setup/devices.sh@11 -- # cleanup_nvme 00:06:30.016 12:00:42 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:30.016 12:00:42 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:30.016 12:00:42 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:06:30.016 12:00:42 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:06:30.016 12:00:42 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:06:30.275 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:06:30.275 /dev/nvme0n1: 8 
bytes were erased at offset 0x3a3817d5e00 (gpt): 45 46 49 20 50 41 52 54 00:06:30.275 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:06:30.275 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:06:30.275 12:00:43 -- setup/devices.sh@12 -- # cleanup_dm 00:06:30.275 12:00:43 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:30.275 12:00:43 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:06:30.275 12:00:43 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:30.275 12:00:43 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:06:30.275 12:00:43 -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:06:30.275 12:00:43 -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:06:30.275 00:06:30.275 real 0m41.469s 00:06:30.275 user 0m11.961s 00:06:30.275 sys 0m24.116s 00:06:30.275 12:00:43 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:30.275 12:00:43 -- common/autotest_common.sh@10 -- # set +x 00:06:30.275 ************************************ 00:06:30.275 END TEST devices 00:06:30.275 ************************************ 00:06:30.275 00:06:30.275 real 2m30.641s 00:06:30.275 user 0m45.177s 00:06:30.275 sys 1m28.777s 00:06:30.275 12:00:43 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:30.275 12:00:43 -- common/autotest_common.sh@10 -- # set +x 00:06:30.275 ************************************ 00:06:30.275 END TEST setup.sh 00:06:30.275 ************************************ 00:06:30.275 12:00:43 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:06:34.471 Hugepages 00:06:34.471 node hugesize free / total 00:06:34.471 node0 1048576kB 0 / 0 00:06:34.471 node0 2048kB 2048 / 2048 00:06:34.471 node1 1048576kB 0 / 0 00:06:34.471 node1 2048kB 0 / 0 00:06:34.471 00:06:34.471 Type BDF Vendor Device NUMA Driver Device Block devices 00:06:34.471 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:06:34.471 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:06:34.471 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:06:34.471 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:06:34.471 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:06:34.471 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:06:34.471 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:06:34.471 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:06:34.471 NVMe 0000:1a:00.0 8086 0a54 0 nvme nvme0 nvme0n1 00:06:34.471 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:06:34.471 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:06:34.471 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:06:34.471 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:06:34.471 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:06:34.471 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:06:34.471 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:06:34.471 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:06:34.471 12:00:47 -- spdk/autotest.sh@141 -- # uname -s 00:06:34.471 12:00:47 -- spdk/autotest.sh@141 -- # [[ Linux == Linux ]] 00:06:34.471 12:00:47 -- spdk/autotest.sh@143 -- # nvme_namespace_revert 00:06:34.471 12:00:47 -- common/autotest_common.sh@1516 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:06:38.670 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:38.670 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:38.670 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:38.670 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:38.670 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 
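The Hugepages table printed by setup.sh status above maps directly onto the kernel's per-node sysfs counters; a sketch of reading them directly (standard sysfs layout assumed, not the script's actual implementation; the ioatdma -> vfio-pci rebinds continue below):

for node in /sys/devices/system/node/node[0-9]*; do
    for hp in "$node"/hugepages/hugepages-*; do
        size=${hp##*hugepages-}                  # e.g. 2048kB
        free=$(cat "$hp/free_hugepages")
        total=$(cat "$hp/nr_hugepages")
        echo "${node##*/} $size $free / $total"  # e.g. node0 2048kB 2048 / 2048
    done
done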
00:06:38.670 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:38.670 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:38.670 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:38.670 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:38.670 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:38.670 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:38.670 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:38.670 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:38.670 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:38.670 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:38.670 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:41.204 0000:1a:00.0 (8086 0a54): nvme -> vfio-pci 00:06:43.803 12:00:56 -- common/autotest_common.sh@1517 -- # sleep 1 00:06:44.371 12:00:57 -- common/autotest_common.sh@1518 -- # bdfs=() 00:06:44.371 12:00:57 -- common/autotest_common.sh@1518 -- # local bdfs 00:06:44.371 12:00:57 -- common/autotest_common.sh@1519 -- # bdfs=($(get_nvme_bdfs)) 00:06:44.371 12:00:57 -- common/autotest_common.sh@1519 -- # get_nvme_bdfs 00:06:44.371 12:00:57 -- common/autotest_common.sh@1498 -- # bdfs=() 00:06:44.371 12:00:57 -- common/autotest_common.sh@1498 -- # local bdfs 00:06:44.371 12:00:57 -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:06:44.371 12:00:57 -- common/autotest_common.sh@1499 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:06:44.371 12:00:57 -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr' 00:06:44.371 12:00:57 -- common/autotest_common.sh@1500 -- # (( 1 == 0 )) 00:06:44.371 12:00:57 -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:1a:00.0 00:06:44.371 12:00:57 -- common/autotest_common.sh@1521 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:06:48.565 Waiting for block devices as requested 00:06:48.566 0000:1a:00.0 (8086 0a54): vfio-pci -> nvme 00:06:48.566 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:06:48.566 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:06:48.566 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:06:48.566 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:06:48.566 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:06:48.825 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:06:48.825 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:06:48.825 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:06:48.825 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:06:49.085 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:06:49.085 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:06:49.085 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:06:49.343 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:06:49.344 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:06:49.344 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:06:49.602 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:06:51.508 12:01:04 -- common/autotest_common.sh@1523 -- # for bdf in "${bdfs[@]}" 00:06:51.508 12:01:04 -- common/autotest_common.sh@1524 -- # get_nvme_ctrlr_from_bdf 0000:1a:00.0 00:06:51.508 12:01:04 -- common/autotest_common.sh@1487 -- # readlink -f /sys/class/nvme/nvme0 00:06:51.508 12:01:04 -- common/autotest_common.sh@1487 -- # grep 0000:1a:00.0/nvme/nvme 00:06:51.508 12:01:04 -- common/autotest_common.sh@1487 -- # bdf_sysfs_path=/sys/devices/pci0000:17/0000:17:00.0/0000:18:00.0/0000:19:00.0/0000:1a:00.0/nvme/nvme0 00:06:51.508 12:01:04 -- common/autotest_common.sh@1488 -- # [[ -z 
/sys/devices/pci0000:17/0000:17:00.0/0000:18:00.0/0000:19:00.0/0000:1a:00.0/nvme/nvme0 ]] 00:06:51.508 12:01:04 -- common/autotest_common.sh@1492 -- # basename /sys/devices/pci0000:17/0000:17:00.0/0000:18:00.0/0000:19:00.0/0000:1a:00.0/nvme/nvme0 00:06:51.508 12:01:04 -- common/autotest_common.sh@1492 -- # printf '%s\n' nvme0 00:06:51.508 12:01:04 -- common/autotest_common.sh@1524 -- # nvme_ctrlr=/dev/nvme0 00:06:51.508 12:01:04 -- common/autotest_common.sh@1525 -- # [[ -z /dev/nvme0 ]] 00:06:51.508 12:01:04 -- common/autotest_common.sh@1530 -- # nvme id-ctrl /dev/nvme0 00:06:51.508 12:01:04 -- common/autotest_common.sh@1530 -- # grep oacs 00:06:51.508 12:01:04 -- common/autotest_common.sh@1530 -- # cut -d: -f2 00:06:51.508 12:01:04 -- common/autotest_common.sh@1530 -- # oacs=' 0xe' 00:06:51.508 12:01:04 -- common/autotest_common.sh@1531 -- # oacs_ns_manage=8 00:06:51.508 12:01:04 -- common/autotest_common.sh@1533 -- # [[ 8 -ne 0 ]] 00:06:51.508 12:01:04 -- common/autotest_common.sh@1539 -- # nvme id-ctrl /dev/nvme0 00:06:51.508 12:01:04 -- common/autotest_common.sh@1539 -- # grep unvmcap 00:06:51.508 12:01:04 -- common/autotest_common.sh@1539 -- # cut -d: -f2 00:06:51.508 12:01:04 -- common/autotest_common.sh@1539 -- # unvmcap=' 0' 00:06:51.508 12:01:04 -- common/autotest_common.sh@1540 -- # [[ 0 -eq 0 ]] 00:06:51.508 12:01:04 -- common/autotest_common.sh@1542 -- # continue 00:06:51.508 12:01:04 -- spdk/autotest.sh@146 -- # timing_exit pre_cleanup 00:06:51.508 12:01:04 -- common/autotest_common.sh@718 -- # xtrace_disable 00:06:51.508 12:01:04 -- common/autotest_common.sh@10 -- # set +x 00:06:51.508 12:01:04 -- spdk/autotest.sh@149 -- # timing_enter afterboot 00:06:51.508 12:01:04 -- common/autotest_common.sh@712 -- # xtrace_disable 00:06:51.508 12:01:04 -- common/autotest_common.sh@10 -- # set +x 00:06:51.508 12:01:04 -- spdk/autotest.sh@150 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:06:55.704 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:55.704 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:55.704 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:55.704 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:55.704 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:55.704 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:55.704 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:55.704 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:55.704 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:55.704 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:55.704 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:55.704 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:55.704 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:55.704 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:55.704 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:55.704 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:58.995 0000:1a:00.0 (8086 0a54): nvme -> vfio-pci 00:07:00.371 12:01:13 -- spdk/autotest.sh@151 -- # timing_exit afterboot 00:07:00.372 12:01:13 -- common/autotest_common.sh@718 -- # xtrace_disable 00:07:00.372 12:01:13 -- common/autotest_common.sh@10 -- # set +x 00:07:00.372 12:01:13 -- spdk/autotest.sh@155 -- # opal_revert_cleanup 00:07:00.372 12:01:13 -- common/autotest_common.sh@1576 -- # mapfile -t bdfs 00:07:00.372 12:01:13 -- common/autotest_common.sh@1576 -- # get_nvme_bdfs_by_id 0x0a54 00:07:00.372 12:01:13 -- common/autotest_common.sh@1562 -- # bdfs=() 00:07:00.372 12:01:13 -- common/autotest_common.sh@1562 -- # local bdfs 00:07:00.372 
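The id-ctrl probing above reduces to a single bit test: OACS bit 3 (0x8) advertises namespace management, and 0xe & 0x8 = 8 is where the oacs_ns_manage=8 in the trace comes from. Condensed (assumes nvme-cli is installed; how the script itself masks the bit is inferred, not shown verbatim in the log):

oacs=$(nvme id-ctrl /dev/nvme0 | grep oacs | cut -d: -f2)   # ' 0xe' on this controller
if (( (oacs & 0x8) != 0 )); then                            # bit 3 = namespace management
    echo "namespace management supported"
fi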
12:01:13 -- common/autotest_common.sh@1564 -- # get_nvme_bdfs 00:07:00.372 12:01:13 -- common/autotest_common.sh@1498 -- # bdfs=() 00:07:00.372 12:01:13 -- common/autotest_common.sh@1498 -- # local bdfs 00:07:00.372 12:01:13 -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:07:00.372 12:01:13 -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr' 00:07:00.372 12:01:13 -- common/autotest_common.sh@1499 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:07:00.631 12:01:13 -- common/autotest_common.sh@1500 -- # (( 1 == 0 )) 00:07:00.631 12:01:13 -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:1a:00.0 00:07:00.631 12:01:13 -- common/autotest_common.sh@1564 -- # for bdf in $(get_nvme_bdfs) 00:07:00.631 12:01:13 -- common/autotest_common.sh@1565 -- # cat /sys/bus/pci/devices/0000:1a:00.0/device 00:07:00.631 12:01:13 -- common/autotest_common.sh@1565 -- # device=0x0a54 00:07:00.631 12:01:13 -- common/autotest_common.sh@1566 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:07:00.631 12:01:13 -- common/autotest_common.sh@1567 -- # bdfs+=($bdf) 00:07:00.631 12:01:13 -- common/autotest_common.sh@1571 -- # printf '%s\n' 0000:1a:00.0 00:07:00.631 12:01:13 -- common/autotest_common.sh@1577 -- # [[ -z 0000:1a:00.0 ]] 00:07:00.631 12:01:13 -- common/autotest_common.sh@1582 -- # spdk_tgt_pid=2669752 00:07:00.631 12:01:13 -- common/autotest_common.sh@1583 -- # waitforlisten 2669752 00:07:00.631 12:01:13 -- common/autotest_common.sh@1581 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:00.631 12:01:13 -- common/autotest_common.sh@819 -- # '[' -z 2669752 ']' 00:07:00.631 12:01:13 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:00.631 12:01:13 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:00.631 12:01:13 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:00.631 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:00.631 12:01:13 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:00.631 12:01:13 -- common/autotest_common.sh@10 -- # set +x 00:07:00.631 [2024-06-11 12:01:13.464628] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 
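get_nvme_bdfs_by_id above identifies the controller by comparing /sys/bus/pci/devices/<bdf>/device against 0x0a54. A sysfs-only sketch of the same match (the class-code filter is an added assumption; the test builds its candidate list from gen_nvme.sh instead):

want=0x0a54                  # device ID this rig's controller reports
bdfs=()
for dev in /sys/bus/pci/devices/*; do
    [[ $(cat "$dev/class") == 0x0108* ]] || continue   # NVMe class code (assumption)
    [[ $(cat "$dev/device") == "$want" ]] && bdfs+=("${dev##*/}")
done
printf '%s\n' "${bdfs[@]}"                             # -> 0000:1a:00.0 here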
00:07:00.631 [2024-06-11 12:01:13.464703] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2669752 ] 00:07:00.631 EAL: No free 2048 kB hugepages reported on node 1 00:07:00.631 [2024-06-11 12:01:13.571388] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:00.631 [2024-06-11 12:01:13.621437] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:00.631 [2024-06-11 12:01:13.621594] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:01.568 12:01:14 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:01.568 12:01:14 -- common/autotest_common.sh@852 -- # return 0 00:07:01.568 12:01:14 -- common/autotest_common.sh@1585 -- # bdf_id=0 00:07:01.568 12:01:14 -- common/autotest_common.sh@1586 -- # for bdf in "${bdfs[@]}" 00:07:01.568 12:01:14 -- common/autotest_common.sh@1587 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:1a:00.0 00:07:04.855 nvme0n1 00:07:04.855 12:01:17 -- common/autotest_common.sh@1589 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:07:04.856 [2024-06-11 12:01:17.614066] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal 00:07:04.856 request: 00:07:04.856 { 00:07:04.856 "nvme_ctrlr_name": "nvme0", 00:07:04.856 "password": "test", 00:07:04.856 "method": "bdev_nvme_opal_revert", 00:07:04.856 "req_id": 1 00:07:04.856 } 00:07:04.856 Got JSON-RPC error response 00:07:04.856 response: 00:07:04.856 { 00:07:04.856 "code": -32602, 00:07:04.856 "message": "Invalid parameters" 00:07:04.856 } 00:07:04.856 12:01:17 -- common/autotest_common.sh@1589 -- # true 00:07:04.856 12:01:17 -- common/autotest_common.sh@1590 -- # (( ++bdf_id )) 00:07:04.856 12:01:17 -- common/autotest_common.sh@1593 -- # killprocess 2669752 00:07:04.856 12:01:17 -- common/autotest_common.sh@926 -- # '[' -z 2669752 ']' 00:07:04.856 12:01:17 -- common/autotest_common.sh@930 -- # kill -0 2669752 00:07:04.856 12:01:17 -- common/autotest_common.sh@931 -- # uname 00:07:04.856 12:01:17 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:07:04.856 12:01:17 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2669752 00:07:04.856 12:01:17 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:07:04.856 12:01:17 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:07:04.856 12:01:17 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2669752' 00:07:04.856 killing process with pid 2669752 00:07:04.856 12:01:17 -- common/autotest_common.sh@945 -- # kill 2669752 00:07:04.856 12:01:17 -- common/autotest_common.sh@950 -- # wait 2669752 00:07:09.045 12:01:21 -- spdk/autotest.sh@161 -- # '[' 0 -eq 1 ']' 00:07:09.045 12:01:21 -- spdk/autotest.sh@165 -- # '[' 1 -eq 1 ']' 00:07:09.045 12:01:21 -- spdk/autotest.sh@166 -- # [[ 0 -eq 1 ]] 00:07:09.045 12:01:21 -- spdk/autotest.sh@166 -- # [[ 0 -eq 1 ]] 00:07:09.045 12:01:21 -- spdk/autotest.sh@173 -- # timing_enter lib 00:07:09.045 12:01:21 -- common/autotest_common.sh@712 -- # xtrace_disable 00:07:09.045 12:01:21 -- common/autotest_common.sh@10 -- # set +x 00:07:09.045 12:01:21 -- spdk/autotest.sh@175 -- # run_test env /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:07:09.045 
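killprocess above checks liveness with kill -0, reads the command name with ps (reactor_0 for spdk_tgt here), and special-cases sudo wrappers before killing and reaping. A sketch of that guard (the sudo branch is an assumption inferred from the '[' reactor_0 = sudo ']' test in the trace):

killprocess() {
    local pid=$1
    kill -0 "$pid" || return 1                    # still alive?
    local name
    name=$(ps --no-headers -o comm= "$pid")       # 'reactor_0' in this run
    if [[ $name == sudo ]]; then
        sudo kill "$pid"                          # assumption: signal through sudo
    else
        kill "$pid"
    fi
    wait "$pid" 2>/dev/null || true               # reap if it is our child
}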
12:01:21 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:09.045 12:01:21 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:09.045 12:01:21 -- common/autotest_common.sh@10 -- # set +x 00:07:09.045 ************************************ 00:07:09.045 START TEST env 00:07:09.045 ************************************ 00:07:09.045 12:01:21 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:07:09.045 * Looking for test storage... 00:07:09.045 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env 00:07:09.045 12:01:21 -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:07:09.045 12:01:21 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:09.045 12:01:21 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:09.045 12:01:21 -- common/autotest_common.sh@10 -- # set +x 00:07:09.045 ************************************ 00:07:09.045 START TEST env_memory 00:07:09.045 ************************************ 00:07:09.045 12:01:21 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:07:09.045 00:07:09.045 00:07:09.045 CUnit - A unit testing framework for C - Version 2.1-3 00:07:09.045 http://cunit.sourceforge.net/ 00:07:09.045 00:07:09.045 00:07:09.045 Suite: memory 00:07:09.045 Test: alloc and free memory map ...[2024-06-11 12:01:21.807517] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:07:09.045 passed 00:07:09.045 Test: mem map translation ...[2024-06-11 12:01:21.827741] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:07:09.045 [2024-06-11 12:01:21.827766] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:07:09.045 [2024-06-11 12:01:21.827812] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:07:09.045 [2024-06-11 12:01:21.827826] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:07:09.045 passed 00:07:09.045 Test: mem map registration ...[2024-06-11 12:01:21.862046] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:07:09.045 [2024-06-11 12:01:21.862069] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:07:09.045 passed 00:07:09.045 Test: mem map adjacent registrations ...passed 00:07:09.045 00:07:09.045 Run Summary: Type Total Ran Passed Failed Inactive 00:07:09.045 suites 1 1 n/a 0 0 00:07:09.045 tests 4 4 4 0 0 00:07:09.045 asserts 152 152 152 0 n/a 00:07:09.045 00:07:09.045 Elapsed time = 0.126 seconds 00:07:09.045 00:07:09.045 real 0m0.140s 00:07:09.045 user 0m0.121s 00:07:09.045 sys 0m0.018s 00:07:09.045 12:01:21 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:09.045 12:01:21 -- 
common/autotest_common.sh@10 -- # set +x 00:07:09.045 ************************************ 00:07:09.045 END TEST env_memory 00:07:09.045 ************************************ 00:07:09.045 12:01:21 -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:07:09.045 12:01:21 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:09.045 12:01:21 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:09.046 12:01:21 -- common/autotest_common.sh@10 -- # set +x 00:07:09.046 ************************************ 00:07:09.046 START TEST env_vtophys 00:07:09.046 ************************************ 00:07:09.046 12:01:21 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:07:09.046 EAL: lib.eal log level changed from notice to debug 00:07:09.046 EAL: Detected lcore 0 as core 0 on socket 0 00:07:09.046 EAL: Detected lcore 1 as core 1 on socket 0 00:07:09.046 EAL: Detected lcore 2 as core 2 on socket 0 00:07:09.046 EAL: Detected lcore 3 as core 3 on socket 0 00:07:09.046 EAL: Detected lcore 4 as core 4 on socket 0 00:07:09.046 EAL: Detected lcore 5 as core 8 on socket 0 00:07:09.046 EAL: Detected lcore 6 as core 9 on socket 0 00:07:09.046 EAL: Detected lcore 7 as core 10 on socket 0 00:07:09.046 EAL: Detected lcore 8 as core 11 on socket 0 00:07:09.046 EAL: Detected lcore 9 as core 16 on socket 0 00:07:09.046 EAL: Detected lcore 10 as core 17 on socket 0 00:07:09.046 EAL: Detected lcore 11 as core 18 on socket 0 00:07:09.046 EAL: Detected lcore 12 as core 19 on socket 0 00:07:09.046 EAL: Detected lcore 13 as core 20 on socket 0 00:07:09.046 EAL: Detected lcore 14 as core 24 on socket 0 00:07:09.046 EAL: Detected lcore 15 as core 25 on socket 0 00:07:09.046 EAL: Detected lcore 16 as core 26 on socket 0 00:07:09.046 EAL: Detected lcore 17 as core 27 on socket 0 00:07:09.046 EAL: Detected lcore 18 as core 0 on socket 1 00:07:09.046 EAL: Detected lcore 19 as core 1 on socket 1 00:07:09.046 EAL: Detected lcore 20 as core 2 on socket 1 00:07:09.046 EAL: Detected lcore 21 as core 3 on socket 1 00:07:09.046 EAL: Detected lcore 22 as core 4 on socket 1 00:07:09.046 EAL: Detected lcore 23 as core 8 on socket 1 00:07:09.046 EAL: Detected lcore 24 as core 9 on socket 1 00:07:09.046 EAL: Detected lcore 25 as core 10 on socket 1 00:07:09.046 EAL: Detected lcore 26 as core 11 on socket 1 00:07:09.046 EAL: Detected lcore 27 as core 16 on socket 1 00:07:09.046 EAL: Detected lcore 28 as core 17 on socket 1 00:07:09.046 EAL: Detected lcore 29 as core 18 on socket 1 00:07:09.046 EAL: Detected lcore 30 as core 19 on socket 1 00:07:09.046 EAL: Detected lcore 31 as core 20 on socket 1 00:07:09.046 EAL: Detected lcore 32 as core 24 on socket 1 00:07:09.046 EAL: Detected lcore 33 as core 25 on socket 1 00:07:09.046 EAL: Detected lcore 34 as core 26 on socket 1 00:07:09.046 EAL: Detected lcore 35 as core 27 on socket 1 00:07:09.046 EAL: Detected lcore 36 as core 0 on socket 0 00:07:09.046 EAL: Detected lcore 37 as core 1 on socket 0 00:07:09.046 EAL: Detected lcore 38 as core 2 on socket 0 00:07:09.046 EAL: Detected lcore 39 as core 3 on socket 0 00:07:09.046 EAL: Detected lcore 40 as core 4 on socket 0 00:07:09.046 EAL: Detected lcore 41 as core 8 on socket 0 00:07:09.046 EAL: Detected lcore 42 as core 9 on socket 0 00:07:09.046 EAL: Detected lcore 43 as core 10 on socket 0 00:07:09.046 EAL: Detected lcore 44 as core 11 on socket 0 00:07:09.046 EAL: Detected lcore 45 as core 16 on socket 
0 00:07:09.046 EAL: Detected lcore 46 as core 17 on socket 0 00:07:09.046 EAL: Detected lcore 47 as core 18 on socket 0 00:07:09.046 EAL: Detected lcore 48 as core 19 on socket 0 00:07:09.046 EAL: Detected lcore 49 as core 20 on socket 0 00:07:09.046 EAL: Detected lcore 50 as core 24 on socket 0 00:07:09.046 EAL: Detected lcore 51 as core 25 on socket 0 00:07:09.046 EAL: Detected lcore 52 as core 26 on socket 0 00:07:09.046 EAL: Detected lcore 53 as core 27 on socket 0 00:07:09.046 EAL: Detected lcore 54 as core 0 on socket 1 00:07:09.046 EAL: Detected lcore 55 as core 1 on socket 1 00:07:09.046 EAL: Detected lcore 56 as core 2 on socket 1 00:07:09.046 EAL: Detected lcore 57 as core 3 on socket 1 00:07:09.046 EAL: Detected lcore 58 as core 4 on socket 1 00:07:09.046 EAL: Detected lcore 59 as core 8 on socket 1 00:07:09.046 EAL: Detected lcore 60 as core 9 on socket 1 00:07:09.046 EAL: Detected lcore 61 as core 10 on socket 1 00:07:09.046 EAL: Detected lcore 62 as core 11 on socket 1 00:07:09.046 EAL: Detected lcore 63 as core 16 on socket 1 00:07:09.046 EAL: Detected lcore 64 as core 17 on socket 1 00:07:09.046 EAL: Detected lcore 65 as core 18 on socket 1 00:07:09.046 EAL: Detected lcore 66 as core 19 on socket 1 00:07:09.046 EAL: Detected lcore 67 as core 20 on socket 1 00:07:09.046 EAL: Detected lcore 68 as core 24 on socket 1 00:07:09.046 EAL: Detected lcore 69 as core 25 on socket 1 00:07:09.046 EAL: Detected lcore 70 as core 26 on socket 1 00:07:09.046 EAL: Detected lcore 71 as core 27 on socket 1 00:07:09.046 EAL: Maximum logical cores by configuration: 128 00:07:09.046 EAL: Detected CPU lcores: 72 00:07:09.046 EAL: Detected NUMA nodes: 2 00:07:09.046 EAL: Checking presence of .so 'librte_eal.so.24.0' 00:07:09.046 EAL: Checking presence of .so 'librte_eal.so.24' 00:07:09.046 EAL: Checking presence of .so 'librte_eal.so' 00:07:09.046 EAL: Detected static linkage of DPDK 00:07:09.046 EAL: No shared files mode enabled, IPC will be disabled 00:07:09.046 EAL: Bus pci wants IOVA as 'DC' 00:07:09.046 EAL: Buses did not request a specific IOVA mode. 00:07:09.046 EAL: IOMMU is available, selecting IOVA as VA mode. 00:07:09.046 EAL: Selected IOVA mode 'VA' 00:07:09.046 EAL: No free 2048 kB hugepages reported on node 1 00:07:09.046 EAL: Probing VFIO support... 00:07:09.046 EAL: IOMMU type 1 (Type 1) is supported 00:07:09.046 EAL: IOMMU type 7 (sPAPR) is not supported 00:07:09.046 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:07:09.046 EAL: VFIO support initialized 00:07:09.046 EAL: Ask a virtual area of 0x2e000 bytes 00:07:09.046 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:07:09.046 EAL: Setting up physically contiguous memory... 
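Each 0x400000000-byte reservation requested below is simply n_segs times the hugepage size: 8192 segments x 2 MiB per memseg list. A one-line check (that the small 0x61000 area preceding each list holds its bookkeeping is an assumption, not stated in the log):

printf '%d = 0x%x\n' $(( 8192 * 2097152 )) $(( 8192 * 2097152 ))
# -> 17179869184 = 0x400000000, the size of every per-list virtual area below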
00:07:09.046 EAL: Setting maximum number of open files to 524288 00:07:09.046 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:07:09.046 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:07:09.046 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:07:09.046 EAL: Ask a virtual area of 0x61000 bytes 00:07:09.046 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:07:09.046 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:07:09.046 EAL: Ask a virtual area of 0x400000000 bytes 00:07:09.046 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:07:09.046 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:07:09.046 EAL: Ask a virtual area of 0x61000 bytes 00:07:09.046 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:07:09.046 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:07:09.046 EAL: Ask a virtual area of 0x400000000 bytes 00:07:09.046 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:07:09.046 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:07:09.046 EAL: Ask a virtual area of 0x61000 bytes 00:07:09.046 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:07:09.046 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:07:09.046 EAL: Ask a virtual area of 0x400000000 bytes 00:07:09.046 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:07:09.046 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:07:09.046 EAL: Ask a virtual area of 0x61000 bytes 00:07:09.046 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:07:09.046 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:07:09.046 EAL: Ask a virtual area of 0x400000000 bytes 00:07:09.046 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:07:09.046 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:07:09.046 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:07:09.046 EAL: Ask a virtual area of 0x61000 bytes 00:07:09.046 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:07:09.046 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:07:09.046 EAL: Ask a virtual area of 0x400000000 bytes 00:07:09.046 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:07:09.046 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:07:09.046 EAL: Ask a virtual area of 0x61000 bytes 00:07:09.046 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:07:09.046 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:07:09.046 EAL: Ask a virtual area of 0x400000000 bytes 00:07:09.046 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:07:09.046 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:07:09.046 EAL: Ask a virtual area of 0x61000 bytes 00:07:09.046 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:07:09.046 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:07:09.046 EAL: Ask a virtual area of 0x400000000 bytes 00:07:09.046 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:07:09.046 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:07:09.046 EAL: Ask a virtual area of 0x61000 bytes 00:07:09.046 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:07:09.046 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:07:09.046 EAL: Ask a virtual area of 0x400000000 bytes 00:07:09.046 EAL: Virtual area found 
at 0x201c01000000 (size = 0x400000000) 00:07:09.046 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:07:09.046 EAL: Hugepages will be freed exactly as allocated. 00:07:09.046 EAL: No shared files mode enabled, IPC is disabled 00:07:09.046 EAL: No shared files mode enabled, IPC is disabled 00:07:09.046 EAL: TSC frequency is ~2300000 KHz 00:07:09.046 EAL: Main lcore 0 is ready (tid=7f62ed20ca00;cpuset=[0]) 00:07:09.046 EAL: Trying to obtain current memory policy. 00:07:09.046 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:09.046 EAL: Restoring previous memory policy: 0 00:07:09.046 EAL: request: mp_malloc_sync 00:07:09.046 EAL: No shared files mode enabled, IPC is disabled 00:07:09.046 EAL: Heap on socket 0 was expanded by 2MB 00:07:09.046 EAL: No shared files mode enabled, IPC is disabled 00:07:09.306 EAL: Mem event callback 'spdk:(nil)' registered 00:07:09.306 00:07:09.306 00:07:09.307 CUnit - A unit testing framework for C - Version 2.1-3 00:07:09.307 http://cunit.sourceforge.net/ 00:07:09.307 00:07:09.307 00:07:09.307 Suite: components_suite 00:07:09.307 Test: vtophys_malloc_test ...passed 00:07:09.307 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:07:09.307 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:09.307 EAL: Restoring previous memory policy: 4 00:07:09.307 EAL: Calling mem event callback 'spdk:(nil)' 00:07:09.307 EAL: request: mp_malloc_sync 00:07:09.307 EAL: No shared files mode enabled, IPC is disabled 00:07:09.307 EAL: Heap on socket 0 was expanded by 4MB 00:07:09.307 EAL: Calling mem event callback 'spdk:(nil)' 00:07:09.307 EAL: request: mp_malloc_sync 00:07:09.307 EAL: No shared files mode enabled, IPC is disabled 00:07:09.307 EAL: Heap on socket 0 was shrunk by 4MB 00:07:09.307 EAL: Trying to obtain current memory policy. 00:07:09.307 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:09.307 EAL: Restoring previous memory policy: 4 00:07:09.307 EAL: Calling mem event callback 'spdk:(nil)' 00:07:09.307 EAL: request: mp_malloc_sync 00:07:09.307 EAL: No shared files mode enabled, IPC is disabled 00:07:09.307 EAL: Heap on socket 0 was expanded by 6MB 00:07:09.307 EAL: Calling mem event callback 'spdk:(nil)' 00:07:09.307 EAL: request: mp_malloc_sync 00:07:09.307 EAL: No shared files mode enabled, IPC is disabled 00:07:09.307 EAL: Heap on socket 0 was shrunk by 6MB 00:07:09.307 EAL: Trying to obtain current memory policy. 00:07:09.307 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:09.307 EAL: Restoring previous memory policy: 4 00:07:09.307 EAL: Calling mem event callback 'spdk:(nil)' 00:07:09.307 EAL: request: mp_malloc_sync 00:07:09.307 EAL: No shared files mode enabled, IPC is disabled 00:07:09.307 EAL: Heap on socket 0 was expanded by 10MB 00:07:09.307 EAL: Calling mem event callback 'spdk:(nil)' 00:07:09.307 EAL: request: mp_malloc_sync 00:07:09.307 EAL: No shared files mode enabled, IPC is disabled 00:07:09.307 EAL: Heap on socket 0 was shrunk by 10MB 00:07:09.307 EAL: Trying to obtain current memory policy. 
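The "Mem event callback 'spdk:(nil)' registered" line above is the hook behind every "Heap on socket 0 was expanded/shrunk" pair in the vtophys run that follows: when the test's allocations outgrow the heap, DPDK maps more hugepages and fires the registered callback so SPDK can update its address maps. A minimal sketch of such a hook, assuming only DPDK's public <rte_memory.h> API; names here are illustrative, not SPDK's actual implementation:

/* Observe the heap grow/shrink events logged above. */
#include <stdio.h>
#include <rte_memory.h>

static void
mem_event_cb(enum rte_mem_event event_type, const void *addr, size_t len,
             void *arg)
{
	/* RTE_MEM_EVENT_ALLOC fires for each "Heap ... was expanded" line,
	 * RTE_MEM_EVENT_FREE before the matching "... was shrunk" line. */
	(void)arg;
	printf("mem event %s: addr %p len %zu\n",
	       event_type == RTE_MEM_EVENT_ALLOC ? "alloc" : "free",
	       addr, len);
}

int
register_mem_hook(void)
{
	/* Returns 0 on success, -1 with rte_errno set on failure. */
	return rte_mem_event_callback_register("example", mem_event_cb, NULL);
}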
00:07:09.307 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:09.307 EAL: Restoring previous memory policy: 4 00:07:09.307 EAL: Calling mem event callback 'spdk:(nil)' 00:07:09.307 EAL: request: mp_malloc_sync 00:07:09.307 EAL: No shared files mode enabled, IPC is disabled 00:07:09.307 EAL: Heap on socket 0 was expanded by 18MB 00:07:09.307 EAL: Calling mem event callback 'spdk:(nil)' 00:07:09.307 EAL: request: mp_malloc_sync 00:07:09.307 EAL: No shared files mode enabled, IPC is disabled 00:07:09.307 EAL: Heap on socket 0 was shrunk by 18MB 00:07:09.307 EAL: Trying to obtain current memory policy. 00:07:09.307 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:09.307 EAL: Restoring previous memory policy: 4 00:07:09.307 EAL: Calling mem event callback 'spdk:(nil)' 00:07:09.307 EAL: request: mp_malloc_sync 00:07:09.307 EAL: No shared files mode enabled, IPC is disabled 00:07:09.307 EAL: Heap on socket 0 was expanded by 34MB 00:07:09.307 EAL: Calling mem event callback 'spdk:(nil)' 00:07:09.307 EAL: request: mp_malloc_sync 00:07:09.307 EAL: No shared files mode enabled, IPC is disabled 00:07:09.307 EAL: Heap on socket 0 was shrunk by 34MB 00:07:09.307 EAL: Trying to obtain current memory policy. 00:07:09.307 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:09.307 EAL: Restoring previous memory policy: 4 00:07:09.307 EAL: Calling mem event callback 'spdk:(nil)' 00:07:09.307 EAL: request: mp_malloc_sync 00:07:09.307 EAL: No shared files mode enabled, IPC is disabled 00:07:09.307 EAL: Heap on socket 0 was expanded by 66MB 00:07:09.307 EAL: Calling mem event callback 'spdk:(nil)' 00:07:09.307 EAL: request: mp_malloc_sync 00:07:09.307 EAL: No shared files mode enabled, IPC is disabled 00:07:09.307 EAL: Heap on socket 0 was shrunk by 66MB 00:07:09.307 EAL: Trying to obtain current memory policy. 00:07:09.307 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:09.307 EAL: Restoring previous memory policy: 4 00:07:09.307 EAL: Calling mem event callback 'spdk:(nil)' 00:07:09.307 EAL: request: mp_malloc_sync 00:07:09.307 EAL: No shared files mode enabled, IPC is disabled 00:07:09.307 EAL: Heap on socket 0 was expanded by 130MB 00:07:09.307 EAL: Calling mem event callback 'spdk:(nil)' 00:07:09.307 EAL: request: mp_malloc_sync 00:07:09.307 EAL: No shared files mode enabled, IPC is disabled 00:07:09.307 EAL: Heap on socket 0 was shrunk by 130MB 00:07:09.307 EAL: Trying to obtain current memory policy. 00:07:09.307 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:09.307 EAL: Restoring previous memory policy: 4 00:07:09.307 EAL: Calling mem event callback 'spdk:(nil)' 00:07:09.307 EAL: request: mp_malloc_sync 00:07:09.307 EAL: No shared files mode enabled, IPC is disabled 00:07:09.307 EAL: Heap on socket 0 was expanded by 258MB 00:07:09.307 EAL: Calling mem event callback 'spdk:(nil)' 00:07:09.565 EAL: request: mp_malloc_sync 00:07:09.565 EAL: No shared files mode enabled, IPC is disabled 00:07:09.565 EAL: Heap on socket 0 was shrunk by 258MB 00:07:09.565 EAL: Trying to obtain current memory policy. 
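The allocation sequence above (4MB, 6MB, 10MB, 18MB, ...) keeps roughly doubling below, up to 1026MB; at each step vtophys_spdk_malloc_test checks that the freshly mapped buffer translates to a valid physical address. A sketch of that translation check, assuming the v24.01-era <spdk/env.h>; the helper name is hypothetical:

/* Allocate pinned DMA memory and translate its virtual address. */
#include <inttypes.h>
#include <stdio.h>
#include <spdk/env.h>

int
check_vtophys(void)
{
	void *buf = spdk_dma_zmalloc(4096, 0x1000, NULL);
	uint64_t paddr;

	if (buf == NULL) {
		return -1;
	}
	paddr = spdk_vtophys(buf, NULL); /* NULL: contiguous length not needed */
	if (paddr == SPDK_VTOPHYS_ERROR) {
		/* Address not backed by registered memory. */
		spdk_dma_free(buf);
		return -1;
	}
	printf("va %p -> pa 0x%" PRIx64 "\n", buf, paddr);
	spdk_dma_free(buf);
	return 0;
}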
00:07:09.565 EAL: Setting policy MPOL_PREFERRED for socket 0
00:07:09.565 EAL: Restoring previous memory policy: 4
00:07:09.565 EAL: Calling mem event callback 'spdk:(nil)'
00:07:09.565 EAL: request: mp_malloc_sync
00:07:09.565 EAL: No shared files mode enabled, IPC is disabled
00:07:09.565 EAL: Heap on socket 0 was expanded by 514MB
00:07:09.565 EAL: Calling mem event callback 'spdk:(nil)'
00:07:09.824 EAL: request: mp_malloc_sync
00:07:09.824 EAL: No shared files mode enabled, IPC is disabled
00:07:09.824 EAL: Heap on socket 0 was shrunk by 514MB
00:07:09.824 EAL: Trying to obtain current memory policy.
00:07:09.824 EAL: Setting policy MPOL_PREFERRED for socket 0
00:07:10.082 EAL: Restoring previous memory policy: 4
00:07:10.082 EAL: Calling mem event callback 'spdk:(nil)'
00:07:10.082 EAL: request: mp_malloc_sync
00:07:10.082 EAL: No shared files mode enabled, IPC is disabled
00:07:10.082 EAL: Heap on socket 0 was expanded by 1026MB
00:07:10.341 EAL: Calling mem event callback 'spdk:(nil)'
00:07:10.341 EAL: request: mp_malloc_sync
00:07:10.341 EAL: No shared files mode enabled, IPC is disabled
00:07:10.341 EAL: Heap on socket 0 was shrunk by 1026MB
00:07:10.341 passed
00:07:10.341
00:07:10.341 Run Summary: Type Total Ran Passed Failed Inactive
00:07:10.341 suites 1 1 n/a 0 0
00:07:10.341 tests 2 2 2 0 0
00:07:10.341 asserts 497 497 497 0 n/a
00:07:10.341
00:07:10.341 Elapsed time = 1.189 seconds
00:07:10.341 EAL: Calling mem event callback 'spdk:(nil)'
00:07:10.341 EAL: request: mp_malloc_sync
00:07:10.341 EAL: No shared files mode enabled, IPC is disabled
00:07:10.341 EAL: Heap on socket 0 was shrunk by 2MB
00:07:10.341 EAL: No shared files mode enabled, IPC is disabled
00:07:10.341 EAL: No shared files mode enabled, IPC is disabled
00:07:10.341 EAL: No shared files mode enabled, IPC is disabled
00:07:10.341
00:07:10.341 real 0m1.354s
00:07:10.341 user 0m0.760s
00:07:10.341 sys 0m0.561s
00:07:10.341 12:01:23 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:07:10.341 12:01:23 -- common/autotest_common.sh@10 -- # set +x
00:07:10.341 ************************************
00:07:10.341 END TEST env_vtophys
00:07:10.341 ************************************
00:07:10.341 12:01:23 -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut
00:07:10.341 12:01:23 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:07:10.341 12:01:23 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:07:10.341 12:01:23 -- common/autotest_common.sh@10 -- # set +x
00:07:10.341 ************************************
00:07:10.341 START TEST env_pci
00:07:10.341 ************************************
00:07:10.341 12:01:23 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut
00:07:10.600
00:07:10.600
00:07:10.600 CUnit - A unit testing framework for C - Version 2.1-3
00:07:10.600 http://cunit.sourceforge.net/
00:07:10.600
00:07:10.600
00:07:10.600 Suite: pci
00:07:10.600 Test: pci_hook ...[2024-06-11 12:01:23.381918] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/pci.c:1041:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 2671093 has claimed it
00:07:10.600 EAL: Cannot find device (10000:00:01.0)
00:07:10.600 EAL: Failed to attach device on primary process
00:07:10.600 passed
00:07:10.600
00:07:10.600 Run Summary: Type Total Ran Passed Failed Inactive
00:07:10.600 suites 1 1 n/a 0 0
00:07:10.600 tests 1 1 1 0 0
00:07:10.600 asserts 25 25 25 0 n/a
00:07:10.600
00:07:10.600 Elapsed time = 0.040 seconds
00:07:10.600
00:07:10.600 real 0m0.059s
00:07:10.600 user 0m0.011s
00:07:10.600 sys 0m0.048s
00:07:10.600 12:01:23 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:07:10.600 12:01:23 -- common/autotest_common.sh@10 -- # set +x
00:07:10.600 ************************************
00:07:10.600 END TEST env_pci
00:07:10.600 ************************************
00:07:10.600 12:01:23 -- env/env.sh@14 -- # argv='-c 0x1 '
00:07:10.600 12:01:23 -- env/env.sh@15 -- # uname
00:07:10.600 12:01:23 -- env/env.sh@15 -- # '[' Linux = Linux ']'
00:07:10.600 12:01:23 -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000
00:07:10.600 12:01:23 -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000
00:07:10.600 12:01:23 -- common/autotest_common.sh@1077 -- # '[' 5 -le 1 ']'
00:07:10.600 12:01:23 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:07:10.600 12:01:23 -- common/autotest_common.sh@10 -- # set +x
00:07:10.600 ************************************
00:07:10.600 START TEST env_dpdk_post_init
00:07:10.600 ************************************
00:07:10.600 12:01:23 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000
00:07:10.600 EAL: Detected CPU lcores: 72
00:07:10.600 EAL: Detected NUMA nodes: 2
00:07:10.600 EAL: Detected static linkage of DPDK
00:07:10.600 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket
00:07:10.600 EAL: Selected IOVA mode 'VA'
00:07:10.600 EAL: No free 2048 kB hugepages reported on node 1
00:07:10.600 EAL: VFIO support initialized
00:07:10.600 TELEMETRY: No legacy callbacks, legacy socket not created
00:07:10.859 EAL: Using IOMMU type 1 (Type 1)
00:07:11.425 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:1a:00.0 (socket 0)
00:07:16.694 EAL: Releasing PCI mapped resource for 0000:1a:00.0
00:07:16.694 EAL: Calling pci_unmap_resource for 0000:1a:00.0 at 0x202001000000
00:07:17.262 Starting DPDK initialization...
00:07:17.262 Starting SPDK post initialization...
00:07:17.262 SPDK NVMe probe
00:07:17.262 Attaching to 0000:1a:00.0
00:07:17.262 Attached to 0000:1a:00.0
00:07:17.262 Cleaning up...
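env_dpdk_post_init above re-initializes the environment with the same "-c 0x1 --base-virtaddr=0x200000000000" arguments the script assembled, then probes the NVMe device. Programmatically the setup corresponds to roughly this sketch, assuming the v24.01-era <spdk/env.h>; the function name is hypothetical:

/* Mirror the command-line options the test passed in. */
#include <spdk/env.h>

int
init_env(void)
{
	struct spdk_env_opts opts;

	spdk_env_opts_init(&opts);
	opts.name = "env_dpdk_post_init";       /* app name, illustrative */
	opts.core_mask = "0x1";                 /* -c 0x1 */
	opts.base_virtaddr = 0x200000000000ULL; /* --base-virtaddr */

	return spdk_env_init(&opts);            /* 0 on success */
}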
00:07:17.262
00:07:17.262 real 0m6.528s
00:07:17.262 user 0m4.852s
00:07:17.262 sys 0m0.928s
00:07:17.262 12:01:30 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:07:17.262 12:01:30 -- common/autotest_common.sh@10 -- # set +x
00:07:17.262 ************************************
00:07:17.262 END TEST env_dpdk_post_init
00:07:17.262 ************************************
00:07:17.262 12:01:30 -- env/env.sh@26 -- # uname
00:07:17.262 12:01:30 -- env/env.sh@26 -- # '[' Linux = Linux ']'
00:07:17.262 12:01:30 -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks
00:07:17.262 12:01:30 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:07:17.262 12:01:30 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:07:17.262 12:01:30 -- common/autotest_common.sh@10 -- # set +x
00:07:17.262 ************************************
00:07:17.262 START TEST env_mem_callbacks
00:07:17.262 ************************************
00:07:17.262 12:01:30 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks
00:07:17.262 EAL: Detected CPU lcores: 72
00:07:17.262 EAL: Detected NUMA nodes: 2
00:07:17.262 EAL: Detected static linkage of DPDK
00:07:17.262 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket
00:07:17.262 EAL: Selected IOVA mode 'VA'
00:07:17.262 EAL: No free 2048 kB hugepages reported on node 1
00:07:17.262 EAL: VFIO support initialized
00:07:17.262 TELEMETRY: No legacy callbacks, legacy socket not created
00:07:17.262
00:07:17.262
00:07:17.262 CUnit - A unit testing framework for C - Version 2.1-3
00:07:17.262 http://cunit.sourceforge.net/
00:07:17.262
00:07:17.262
00:07:17.262 Suite: memory
00:07:17.262 Test: test ...
00:07:17.262 register 0x200000200000 2097152
00:07:17.262 malloc 3145728
00:07:17.262 register 0x200000400000 4194304
00:07:17.262 buf 0x200000500000 len 3145728 PASSED
00:07:17.262 malloc 64
00:07:17.262 buf 0x2000004fff40 len 64 PASSED
00:07:17.262 malloc 4194304
00:07:17.262 register 0x200000800000 6291456
00:07:17.262 buf 0x200000a00000 len 4194304 PASSED
00:07:17.262 free 0x200000500000 3145728
00:07:17.262 free 0x2000004fff40 64
00:07:17.262 unregister 0x200000400000 4194304 PASSED
00:07:17.262 free 0x200000a00000 4194304
00:07:17.262 unregister 0x200000800000 6291456 PASSED
00:07:17.262 malloc 8388608
00:07:17.262 register 0x200000400000 10485760
00:07:17.262 buf 0x200000600000 len 8388608 PASSED
00:07:17.262 free 0x200000600000 8388608
00:07:17.262 unregister 0x200000400000 10485760 PASSED
00:07:17.262 passed
00:07:17.262
00:07:17.262 Run Summary: Type Total Ran Passed Failed Inactive
00:07:17.262 suites 1 1 n/a 0 0
00:07:17.262 tests 1 1 1 0 0
00:07:17.262 asserts 15 15 15 0 n/a
00:07:17.262
00:07:17.262 Elapsed time = 0.008 seconds
00:07:17.262
00:07:17.262 real 0m0.088s
00:07:17.262 user 0m0.026s
00:07:17.262 sys 0m0.062s
00:07:17.262 12:01:30 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:07:17.262 12:01:30 -- common/autotest_common.sh@10 -- # set +x
00:07:17.262 ************************************
00:07:17.262 END TEST env_mem_callbacks
00:07:17.262 ************************************
00:07:17.262
00:07:17.262 real 0m8.534s
00:07:17.262 user 0m5.893s
00:07:17.262 sys 0m1.909s
00:07:17.262 12:01:30 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:07:17.262 12:01:30 -- common/autotest_common.sh@10 -- # set +x
00:07:17.262 ************************************
00:07:17.262 END TEST env
00:07:17.262 ************************************
00:07:17.262 12:01:30 -- spdk/autotest.sh@176 -- # run_test rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh
00:07:17.262 12:01:30 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:07:17.262 12:01:30 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:07:17.262 12:01:30 -- common/autotest_common.sh@10 -- # set +x
00:07:17.262 ************************************
00:07:17.263 START TEST rpc
00:07:17.263 ************************************
00:07:17.263 12:01:30 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh
00:07:17.522 * Looking for test storage...
00:07:17.522 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc
00:07:17.522 12:01:30 -- rpc/rpc.sh@65 -- # spdk_pid=2672166
00:07:17.522 12:01:30 -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT
00:07:17.522 12:01:30 -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -e bdev
00:07:17.522 12:01:30 -- rpc/rpc.sh@67 -- # waitforlisten 2672166
00:07:17.522 12:01:30 -- common/autotest_common.sh@819 -- # '[' -z 2672166 ']'
00:07:17.522 12:01:30 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock
00:07:17.522 12:01:30 -- common/autotest_common.sh@824 -- # local max_retries=100
00:07:17.522 12:01:30 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
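The "register <vaddr> <len>" / "unregister <vaddr> <len>" lines in the env_mem_callbacks trace above correspond to the public pair any SPDK consumer uses to bring externally allocated, pinned memory into the env layer's maps. A minimal sketch, assuming <spdk/env.h>; the wrapper name is hypothetical, and real callers pass hugepage-backed, page-aligned regions they own:

/* Register a region, then mirror it with an unregister, as the test does. */
#include <spdk/env.h>

int
track_region(void *vaddr, size_t len)
{
	int rc;

	rc = spdk_mem_register(vaddr, len);   /* 0 or negative errno */
	if (rc != 0) {
		return rc;
	}
	/* ... region is now visible to spdk_vtophys() and mem maps ... */
	return spdk_mem_unregister(vaddr, len);
}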
00:07:17.522 12:01:30 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:17.522 12:01:30 -- common/autotest_common.sh@10 -- # set +x 00:07:17.522 [2024-06-11 12:01:30.365454] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:07:17.522 [2024-06-11 12:01:30.365540] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2672166 ] 00:07:17.522 EAL: No free 2048 kB hugepages reported on node 1 00:07:17.522 [2024-06-11 12:01:30.485345] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:17.522 [2024-06-11 12:01:30.530724] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:17.522 [2024-06-11 12:01:30.530856] app.c: 488:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:07:17.522 [2024-06-11 12:01:30.530871] app.c: 492:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 2672166' to capture a snapshot of events at runtime. 00:07:17.522 [2024-06-11 12:01:30.530888] app.c: 494:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid2672166 for offline analysis/debug. 00:07:17.522 [2024-06-11 12:01:30.530919] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:18.460 12:01:31 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:18.460 12:01:31 -- common/autotest_common.sh@852 -- # return 0 00:07:18.460 12:01:31 -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:07:18.460 12:01:31 -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:07:18.460 12:01:31 -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:07:18.460 12:01:31 -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:07:18.460 12:01:31 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:18.460 12:01:31 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:18.460 12:01:31 -- common/autotest_common.sh@10 -- # set +x 00:07:18.460 ************************************ 00:07:18.460 START TEST rpc_integrity 00:07:18.460 ************************************ 00:07:18.460 12:01:31 -- common/autotest_common.sh@1104 -- # rpc_integrity 00:07:18.460 12:01:31 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:07:18.460 12:01:31 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:18.460 12:01:31 -- common/autotest_common.sh@10 -- # set +x 00:07:18.460 12:01:31 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:18.460 12:01:31 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:07:18.460 12:01:31 -- rpc/rpc.sh@13 -- # jq length 00:07:18.460 12:01:31 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:07:18.460 12:01:31 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:07:18.460 12:01:31 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:18.460 12:01:31 -- common/autotest_common.sh@10 -- # set +x 00:07:18.460 12:01:31 -- 
common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:18.460 12:01:31 -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:07:18.460 12:01:31 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:07:18.460 12:01:31 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:18.460 12:01:31 -- common/autotest_common.sh@10 -- # set +x 00:07:18.460 12:01:31 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:18.460 12:01:31 -- rpc/rpc.sh@16 -- # bdevs='[ 00:07:18.460 { 00:07:18.460 "name": "Malloc0", 00:07:18.460 "aliases": [ 00:07:18.460 "6a85af7e-6172-48c8-aa95-f7908e3b3f79" 00:07:18.460 ], 00:07:18.460 "product_name": "Malloc disk", 00:07:18.460 "block_size": 512, 00:07:18.460 "num_blocks": 16384, 00:07:18.460 "uuid": "6a85af7e-6172-48c8-aa95-f7908e3b3f79", 00:07:18.460 "assigned_rate_limits": { 00:07:18.460 "rw_ios_per_sec": 0, 00:07:18.460 "rw_mbytes_per_sec": 0, 00:07:18.460 "r_mbytes_per_sec": 0, 00:07:18.460 "w_mbytes_per_sec": 0 00:07:18.460 }, 00:07:18.460 "claimed": false, 00:07:18.460 "zoned": false, 00:07:18.460 "supported_io_types": { 00:07:18.460 "read": true, 00:07:18.460 "write": true, 00:07:18.460 "unmap": true, 00:07:18.460 "write_zeroes": true, 00:07:18.460 "flush": true, 00:07:18.460 "reset": true, 00:07:18.460 "compare": false, 00:07:18.460 "compare_and_write": false, 00:07:18.460 "abort": true, 00:07:18.460 "nvme_admin": false, 00:07:18.460 "nvme_io": false 00:07:18.460 }, 00:07:18.460 "memory_domains": [ 00:07:18.460 { 00:07:18.460 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:18.460 "dma_device_type": 2 00:07:18.460 } 00:07:18.460 ], 00:07:18.460 "driver_specific": {} 00:07:18.460 } 00:07:18.460 ]' 00:07:18.460 12:01:31 -- rpc/rpc.sh@17 -- # jq length 00:07:18.460 12:01:31 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:07:18.460 12:01:31 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:07:18.460 12:01:31 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:18.460 12:01:31 -- common/autotest_common.sh@10 -- # set +x 00:07:18.460 [2024-06-11 12:01:31.454743] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:07:18.460 [2024-06-11 12:01:31.454784] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:18.460 [2024-06-11 12:01:31.454805] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x41a7f90 00:07:18.460 [2024-06-11 12:01:31.454818] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:18.460 [2024-06-11 12:01:31.455963] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:18.460 [2024-06-11 12:01:31.455996] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:07:18.460 Passthru0 00:07:18.460 12:01:31 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:18.460 12:01:31 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:07:18.460 12:01:31 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:18.460 12:01:31 -- common/autotest_common.sh@10 -- # set +x 00:07:18.460 12:01:31 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:18.460 12:01:31 -- rpc/rpc.sh@20 -- # bdevs='[ 00:07:18.460 { 00:07:18.460 "name": "Malloc0", 00:07:18.460 "aliases": [ 00:07:18.460 "6a85af7e-6172-48c8-aa95-f7908e3b3f79" 00:07:18.460 ], 00:07:18.460 "product_name": "Malloc disk", 00:07:18.460 "block_size": 512, 00:07:18.460 "num_blocks": 16384, 00:07:18.460 "uuid": "6a85af7e-6172-48c8-aa95-f7908e3b3f79", 00:07:18.460 "assigned_rate_limits": { 00:07:18.460 "rw_ios_per_sec": 0, 00:07:18.460 
"rw_mbytes_per_sec": 0, 00:07:18.460 "r_mbytes_per_sec": 0, 00:07:18.460 "w_mbytes_per_sec": 0 00:07:18.460 }, 00:07:18.460 "claimed": true, 00:07:18.460 "claim_type": "exclusive_write", 00:07:18.460 "zoned": false, 00:07:18.460 "supported_io_types": { 00:07:18.460 "read": true, 00:07:18.460 "write": true, 00:07:18.460 "unmap": true, 00:07:18.460 "write_zeroes": true, 00:07:18.460 "flush": true, 00:07:18.460 "reset": true, 00:07:18.460 "compare": false, 00:07:18.460 "compare_and_write": false, 00:07:18.460 "abort": true, 00:07:18.460 "nvme_admin": false, 00:07:18.460 "nvme_io": false 00:07:18.460 }, 00:07:18.460 "memory_domains": [ 00:07:18.460 { 00:07:18.460 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:18.460 "dma_device_type": 2 00:07:18.460 } 00:07:18.460 ], 00:07:18.460 "driver_specific": {} 00:07:18.460 }, 00:07:18.460 { 00:07:18.460 "name": "Passthru0", 00:07:18.460 "aliases": [ 00:07:18.460 "9c1d6a27-fd04-5d08-9086-df7b9dab3d99" 00:07:18.460 ], 00:07:18.460 "product_name": "passthru", 00:07:18.460 "block_size": 512, 00:07:18.460 "num_blocks": 16384, 00:07:18.460 "uuid": "9c1d6a27-fd04-5d08-9086-df7b9dab3d99", 00:07:18.460 "assigned_rate_limits": { 00:07:18.460 "rw_ios_per_sec": 0, 00:07:18.460 "rw_mbytes_per_sec": 0, 00:07:18.460 "r_mbytes_per_sec": 0, 00:07:18.460 "w_mbytes_per_sec": 0 00:07:18.460 }, 00:07:18.460 "claimed": false, 00:07:18.460 "zoned": false, 00:07:18.460 "supported_io_types": { 00:07:18.460 "read": true, 00:07:18.460 "write": true, 00:07:18.460 "unmap": true, 00:07:18.460 "write_zeroes": true, 00:07:18.460 "flush": true, 00:07:18.460 "reset": true, 00:07:18.460 "compare": false, 00:07:18.460 "compare_and_write": false, 00:07:18.460 "abort": true, 00:07:18.460 "nvme_admin": false, 00:07:18.460 "nvme_io": false 00:07:18.460 }, 00:07:18.460 "memory_domains": [ 00:07:18.460 { 00:07:18.460 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:18.460 "dma_device_type": 2 00:07:18.460 } 00:07:18.460 ], 00:07:18.460 "driver_specific": { 00:07:18.460 "passthru": { 00:07:18.460 "name": "Passthru0", 00:07:18.460 "base_bdev_name": "Malloc0" 00:07:18.460 } 00:07:18.460 } 00:07:18.460 } 00:07:18.460 ]' 00:07:18.720 12:01:31 -- rpc/rpc.sh@21 -- # jq length 00:07:18.720 12:01:31 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:07:18.720 12:01:31 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:07:18.720 12:01:31 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:18.720 12:01:31 -- common/autotest_common.sh@10 -- # set +x 00:07:18.720 12:01:31 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:18.720 12:01:31 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:07:18.720 12:01:31 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:18.720 12:01:31 -- common/autotest_common.sh@10 -- # set +x 00:07:18.720 12:01:31 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:18.720 12:01:31 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:07:18.720 12:01:31 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:18.720 12:01:31 -- common/autotest_common.sh@10 -- # set +x 00:07:18.720 12:01:31 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:18.720 12:01:31 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:07:18.720 12:01:31 -- rpc/rpc.sh@26 -- # jq length 00:07:18.720 12:01:31 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:07:18.720 00:07:18.720 real 0m0.296s 00:07:18.720 user 0m0.194s 00:07:18.720 sys 0m0.034s 00:07:18.720 12:01:31 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:18.720 12:01:31 -- common/autotest_common.sh@10 -- # set +x 00:07:18.720 
************************************ 00:07:18.720 END TEST rpc_integrity 00:07:18.720 ************************************ 00:07:18.720 12:01:31 -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:07:18.720 12:01:31 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:18.720 12:01:31 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:18.720 12:01:31 -- common/autotest_common.sh@10 -- # set +x 00:07:18.720 ************************************ 00:07:18.720 START TEST rpc_plugins 00:07:18.720 ************************************ 00:07:18.720 12:01:31 -- common/autotest_common.sh@1104 -- # rpc_plugins 00:07:18.720 12:01:31 -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:07:18.720 12:01:31 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:18.720 12:01:31 -- common/autotest_common.sh@10 -- # set +x 00:07:18.720 12:01:31 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:18.720 12:01:31 -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:07:18.720 12:01:31 -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:07:18.720 12:01:31 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:18.720 12:01:31 -- common/autotest_common.sh@10 -- # set +x 00:07:18.720 12:01:31 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:18.720 12:01:31 -- rpc/rpc.sh@31 -- # bdevs='[ 00:07:18.720 { 00:07:18.720 "name": "Malloc1", 00:07:18.720 "aliases": [ 00:07:18.720 "55e607be-c3a1-4cf5-8fab-b8472ad3524d" 00:07:18.720 ], 00:07:18.720 "product_name": "Malloc disk", 00:07:18.720 "block_size": 4096, 00:07:18.720 "num_blocks": 256, 00:07:18.720 "uuid": "55e607be-c3a1-4cf5-8fab-b8472ad3524d", 00:07:18.720 "assigned_rate_limits": { 00:07:18.720 "rw_ios_per_sec": 0, 00:07:18.720 "rw_mbytes_per_sec": 0, 00:07:18.720 "r_mbytes_per_sec": 0, 00:07:18.720 "w_mbytes_per_sec": 0 00:07:18.720 }, 00:07:18.720 "claimed": false, 00:07:18.720 "zoned": false, 00:07:18.720 "supported_io_types": { 00:07:18.720 "read": true, 00:07:18.720 "write": true, 00:07:18.720 "unmap": true, 00:07:18.720 "write_zeroes": true, 00:07:18.720 "flush": true, 00:07:18.720 "reset": true, 00:07:18.720 "compare": false, 00:07:18.720 "compare_and_write": false, 00:07:18.720 "abort": true, 00:07:18.720 "nvme_admin": false, 00:07:18.720 "nvme_io": false 00:07:18.720 }, 00:07:18.720 "memory_domains": [ 00:07:18.720 { 00:07:18.720 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:18.720 "dma_device_type": 2 00:07:18.720 } 00:07:18.720 ], 00:07:18.720 "driver_specific": {} 00:07:18.720 } 00:07:18.720 ]' 00:07:18.720 12:01:31 -- rpc/rpc.sh@32 -- # jq length 00:07:18.720 12:01:31 -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:07:18.720 12:01:31 -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:07:18.720 12:01:31 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:18.720 12:01:31 -- common/autotest_common.sh@10 -- # set +x 00:07:18.720 12:01:31 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:18.720 12:01:31 -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:07:18.720 12:01:31 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:18.720 12:01:31 -- common/autotest_common.sh@10 -- # set +x 00:07:18.979 12:01:31 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:18.979 12:01:31 -- rpc/rpc.sh@35 -- # bdevs='[]' 00:07:18.979 12:01:31 -- rpc/rpc.sh@36 -- # jq length 00:07:18.979 12:01:31 -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:07:18.979 00:07:18.979 real 0m0.137s 00:07:18.979 user 0m0.088s 00:07:18.979 sys 0m0.016s 00:07:18.979 12:01:31 -- common/autotest_common.sh@1105 -- # xtrace_disable 
00:07:18.979 12:01:31 -- common/autotest_common.sh@10 -- # set +x 00:07:18.979 ************************************ 00:07:18.979 END TEST rpc_plugins 00:07:18.979 ************************************ 00:07:18.979 12:01:31 -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:07:18.979 12:01:31 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:18.979 12:01:31 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:18.979 12:01:31 -- common/autotest_common.sh@10 -- # set +x 00:07:18.979 ************************************ 00:07:18.979 START TEST rpc_trace_cmd_test 00:07:18.979 ************************************ 00:07:18.979 12:01:31 -- common/autotest_common.sh@1104 -- # rpc_trace_cmd_test 00:07:18.979 12:01:31 -- rpc/rpc.sh@40 -- # local info 00:07:18.979 12:01:31 -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:07:18.979 12:01:31 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:18.979 12:01:31 -- common/autotest_common.sh@10 -- # set +x 00:07:18.979 12:01:31 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:18.979 12:01:31 -- rpc/rpc.sh@42 -- # info='{ 00:07:18.979 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid2672166", 00:07:18.979 "tpoint_group_mask": "0x8", 00:07:18.979 "iscsi_conn": { 00:07:18.979 "mask": "0x2", 00:07:18.979 "tpoint_mask": "0x0" 00:07:18.979 }, 00:07:18.979 "scsi": { 00:07:18.979 "mask": "0x4", 00:07:18.979 "tpoint_mask": "0x0" 00:07:18.979 }, 00:07:18.980 "bdev": { 00:07:18.980 "mask": "0x8", 00:07:18.980 "tpoint_mask": "0xffffffffffffffff" 00:07:18.980 }, 00:07:18.980 "nvmf_rdma": { 00:07:18.980 "mask": "0x10", 00:07:18.980 "tpoint_mask": "0x0" 00:07:18.980 }, 00:07:18.980 "nvmf_tcp": { 00:07:18.980 "mask": "0x20", 00:07:18.980 "tpoint_mask": "0x0" 00:07:18.980 }, 00:07:18.980 "ftl": { 00:07:18.980 "mask": "0x40", 00:07:18.980 "tpoint_mask": "0x0" 00:07:18.980 }, 00:07:18.980 "blobfs": { 00:07:18.980 "mask": "0x80", 00:07:18.980 "tpoint_mask": "0x0" 00:07:18.980 }, 00:07:18.980 "dsa": { 00:07:18.980 "mask": "0x200", 00:07:18.980 "tpoint_mask": "0x0" 00:07:18.980 }, 00:07:18.980 "thread": { 00:07:18.980 "mask": "0x400", 00:07:18.980 "tpoint_mask": "0x0" 00:07:18.980 }, 00:07:18.980 "nvme_pcie": { 00:07:18.980 "mask": "0x800", 00:07:18.980 "tpoint_mask": "0x0" 00:07:18.980 }, 00:07:18.980 "iaa": { 00:07:18.980 "mask": "0x1000", 00:07:18.980 "tpoint_mask": "0x0" 00:07:18.980 }, 00:07:18.980 "nvme_tcp": { 00:07:18.980 "mask": "0x2000", 00:07:18.980 "tpoint_mask": "0x0" 00:07:18.980 }, 00:07:18.980 "bdev_nvme": { 00:07:18.980 "mask": "0x4000", 00:07:18.980 "tpoint_mask": "0x0" 00:07:18.980 } 00:07:18.980 }' 00:07:18.980 12:01:31 -- rpc/rpc.sh@43 -- # jq length 00:07:18.980 12:01:31 -- rpc/rpc.sh@43 -- # '[' 15 -gt 2 ']' 00:07:18.980 12:01:31 -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:07:18.980 12:01:31 -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:07:18.980 12:01:31 -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:07:18.980 12:01:31 -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:07:18.980 12:01:31 -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:07:18.980 12:01:32 -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:07:18.980 12:01:32 -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:07:19.243 12:01:32 -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:07:19.243 00:07:19.243 real 0m0.199s 00:07:19.243 user 0m0.156s 00:07:19.243 sys 0m0.036s 00:07:19.243 12:01:32 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:19.243 12:01:32 -- common/autotest_common.sh@10 -- # set +x 00:07:19.243 
************************************ 00:07:19.243 END TEST rpc_trace_cmd_test 00:07:19.243 ************************************ 00:07:19.243 12:01:32 -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:07:19.243 12:01:32 -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:07:19.243 12:01:32 -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:07:19.243 12:01:32 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:19.243 12:01:32 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:19.243 12:01:32 -- common/autotest_common.sh@10 -- # set +x 00:07:19.243 ************************************ 00:07:19.243 START TEST rpc_daemon_integrity 00:07:19.243 ************************************ 00:07:19.243 12:01:32 -- common/autotest_common.sh@1104 -- # rpc_integrity 00:07:19.243 12:01:32 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:07:19.243 12:01:32 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:19.243 12:01:32 -- common/autotest_common.sh@10 -- # set +x 00:07:19.243 12:01:32 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:19.243 12:01:32 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:07:19.243 12:01:32 -- rpc/rpc.sh@13 -- # jq length 00:07:19.243 12:01:32 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:07:19.243 12:01:32 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:07:19.243 12:01:32 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:19.243 12:01:32 -- common/autotest_common.sh@10 -- # set +x 00:07:19.243 12:01:32 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:19.243 12:01:32 -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:07:19.243 12:01:32 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:07:19.243 12:01:32 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:19.243 12:01:32 -- common/autotest_common.sh@10 -- # set +x 00:07:19.243 12:01:32 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:19.243 12:01:32 -- rpc/rpc.sh@16 -- # bdevs='[ 00:07:19.243 { 00:07:19.243 "name": "Malloc2", 00:07:19.243 "aliases": [ 00:07:19.243 "47f2d347-f37c-43be-b929-28a2b9b68ae7" 00:07:19.243 ], 00:07:19.243 "product_name": "Malloc disk", 00:07:19.243 "block_size": 512, 00:07:19.243 "num_blocks": 16384, 00:07:19.243 "uuid": "47f2d347-f37c-43be-b929-28a2b9b68ae7", 00:07:19.243 "assigned_rate_limits": { 00:07:19.243 "rw_ios_per_sec": 0, 00:07:19.243 "rw_mbytes_per_sec": 0, 00:07:19.243 "r_mbytes_per_sec": 0, 00:07:19.243 "w_mbytes_per_sec": 0 00:07:19.243 }, 00:07:19.243 "claimed": false, 00:07:19.243 "zoned": false, 00:07:19.243 "supported_io_types": { 00:07:19.243 "read": true, 00:07:19.243 "write": true, 00:07:19.243 "unmap": true, 00:07:19.243 "write_zeroes": true, 00:07:19.243 "flush": true, 00:07:19.243 "reset": true, 00:07:19.243 "compare": false, 00:07:19.243 "compare_and_write": false, 00:07:19.243 "abort": true, 00:07:19.243 "nvme_admin": false, 00:07:19.243 "nvme_io": false 00:07:19.243 }, 00:07:19.243 "memory_domains": [ 00:07:19.243 { 00:07:19.243 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:19.243 "dma_device_type": 2 00:07:19.243 } 00:07:19.243 ], 00:07:19.243 "driver_specific": {} 00:07:19.243 } 00:07:19.243 ]' 00:07:19.243 12:01:32 -- rpc/rpc.sh@17 -- # jq length 00:07:19.243 12:01:32 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:07:19.243 12:01:32 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:07:19.244 12:01:32 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:19.244 12:01:32 -- common/autotest_common.sh@10 -- # set +x 00:07:19.244 [2024-06-11 12:01:32.196738] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on 
Malloc2 00:07:19.244 [2024-06-11 12:01:32.196778] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:19.244 [2024-06-11 12:01:32.196800] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x41a77b0 00:07:19.244 [2024-06-11 12:01:32.196813] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:19.244 [2024-06-11 12:01:32.197778] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:19.244 [2024-06-11 12:01:32.197807] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:07:19.244 Passthru0 00:07:19.244 12:01:32 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:19.244 12:01:32 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:07:19.244 12:01:32 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:19.244 12:01:32 -- common/autotest_common.sh@10 -- # set +x 00:07:19.244 12:01:32 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:19.244 12:01:32 -- rpc/rpc.sh@20 -- # bdevs='[ 00:07:19.244 { 00:07:19.244 "name": "Malloc2", 00:07:19.244 "aliases": [ 00:07:19.244 "47f2d347-f37c-43be-b929-28a2b9b68ae7" 00:07:19.244 ], 00:07:19.244 "product_name": "Malloc disk", 00:07:19.244 "block_size": 512, 00:07:19.244 "num_blocks": 16384, 00:07:19.244 "uuid": "47f2d347-f37c-43be-b929-28a2b9b68ae7", 00:07:19.244 "assigned_rate_limits": { 00:07:19.244 "rw_ios_per_sec": 0, 00:07:19.244 "rw_mbytes_per_sec": 0, 00:07:19.244 "r_mbytes_per_sec": 0, 00:07:19.244 "w_mbytes_per_sec": 0 00:07:19.244 }, 00:07:19.244 "claimed": true, 00:07:19.244 "claim_type": "exclusive_write", 00:07:19.244 "zoned": false, 00:07:19.244 "supported_io_types": { 00:07:19.244 "read": true, 00:07:19.244 "write": true, 00:07:19.244 "unmap": true, 00:07:19.244 "write_zeroes": true, 00:07:19.244 "flush": true, 00:07:19.244 "reset": true, 00:07:19.244 "compare": false, 00:07:19.244 "compare_and_write": false, 00:07:19.244 "abort": true, 00:07:19.244 "nvme_admin": false, 00:07:19.244 "nvme_io": false 00:07:19.244 }, 00:07:19.244 "memory_domains": [ 00:07:19.244 { 00:07:19.244 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:19.244 "dma_device_type": 2 00:07:19.244 } 00:07:19.244 ], 00:07:19.244 "driver_specific": {} 00:07:19.244 }, 00:07:19.244 { 00:07:19.244 "name": "Passthru0", 00:07:19.244 "aliases": [ 00:07:19.244 "459a6ec3-6b05-5520-abe9-c177d8d1e7f9" 00:07:19.244 ], 00:07:19.244 "product_name": "passthru", 00:07:19.244 "block_size": 512, 00:07:19.244 "num_blocks": 16384, 00:07:19.244 "uuid": "459a6ec3-6b05-5520-abe9-c177d8d1e7f9", 00:07:19.244 "assigned_rate_limits": { 00:07:19.244 "rw_ios_per_sec": 0, 00:07:19.244 "rw_mbytes_per_sec": 0, 00:07:19.244 "r_mbytes_per_sec": 0, 00:07:19.244 "w_mbytes_per_sec": 0 00:07:19.244 }, 00:07:19.244 "claimed": false, 00:07:19.244 "zoned": false, 00:07:19.244 "supported_io_types": { 00:07:19.244 "read": true, 00:07:19.244 "write": true, 00:07:19.244 "unmap": true, 00:07:19.244 "write_zeroes": true, 00:07:19.244 "flush": true, 00:07:19.244 "reset": true, 00:07:19.244 "compare": false, 00:07:19.244 "compare_and_write": false, 00:07:19.244 "abort": true, 00:07:19.244 "nvme_admin": false, 00:07:19.244 "nvme_io": false 00:07:19.244 }, 00:07:19.244 "memory_domains": [ 00:07:19.244 { 00:07:19.244 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:19.244 "dma_device_type": 2 00:07:19.244 } 00:07:19.244 ], 00:07:19.244 "driver_specific": { 00:07:19.244 "passthru": { 00:07:19.244 "name": "Passthru0", 00:07:19.244 "base_bdev_name": "Malloc2" 00:07:19.244 } 
00:07:19.244 } 00:07:19.244 } 00:07:19.244 ]' 00:07:19.244 12:01:32 -- rpc/rpc.sh@21 -- # jq length 00:07:19.505 12:01:32 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:07:19.505 12:01:32 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:07:19.505 12:01:32 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:19.505 12:01:32 -- common/autotest_common.sh@10 -- # set +x 00:07:19.505 12:01:32 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:19.505 12:01:32 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:07:19.505 12:01:32 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:19.505 12:01:32 -- common/autotest_common.sh@10 -- # set +x 00:07:19.505 12:01:32 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:19.505 12:01:32 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:07:19.505 12:01:32 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:19.505 12:01:32 -- common/autotest_common.sh@10 -- # set +x 00:07:19.505 12:01:32 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:19.505 12:01:32 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:07:19.505 12:01:32 -- rpc/rpc.sh@26 -- # jq length 00:07:19.505 12:01:32 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:07:19.505 00:07:19.505 real 0m0.247s 00:07:19.505 user 0m0.141s 00:07:19.505 sys 0m0.045s 00:07:19.505 12:01:32 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:19.505 12:01:32 -- common/autotest_common.sh@10 -- # set +x 00:07:19.505 ************************************ 00:07:19.505 END TEST rpc_daemon_integrity 00:07:19.505 ************************************ 00:07:19.505 12:01:32 -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:07:19.505 12:01:32 -- rpc/rpc.sh@84 -- # killprocess 2672166 00:07:19.505 12:01:32 -- common/autotest_common.sh@926 -- # '[' -z 2672166 ']' 00:07:19.505 12:01:32 -- common/autotest_common.sh@930 -- # kill -0 2672166 00:07:19.505 12:01:32 -- common/autotest_common.sh@931 -- # uname 00:07:19.505 12:01:32 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:07:19.505 12:01:32 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2672166 00:07:19.505 12:01:32 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:07:19.505 12:01:32 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:07:19.505 12:01:32 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2672166' 00:07:19.505 killing process with pid 2672166 00:07:19.505 12:01:32 -- common/autotest_common.sh@945 -- # kill 2672166 00:07:19.505 12:01:32 -- common/autotest_common.sh@950 -- # wait 2672166 00:07:19.764 00:07:19.764 real 0m2.543s 00:07:19.764 user 0m3.194s 00:07:19.764 sys 0m0.778s 00:07:19.764 12:01:32 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:19.764 12:01:32 -- common/autotest_common.sh@10 -- # set +x 00:07:19.764 ************************************ 00:07:19.764 END TEST rpc 00:07:19.764 ************************************ 00:07:20.022 12:01:32 -- spdk/autotest.sh@177 -- # run_test rpc_client /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:07:20.022 12:01:32 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:20.022 12:01:32 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:20.022 12:01:32 -- common/autotest_common.sh@10 -- # set +x 00:07:20.022 ************************************ 00:07:20.022 START TEST rpc_client 00:07:20.022 ************************************ 00:07:20.022 12:01:32 -- common/autotest_common.sh@1104 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:07:20.022 * Looking for test storage... 00:07:20.022 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client 00:07:20.022 12:01:32 -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:07:20.022 OK 00:07:20.022 12:01:32 -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:07:20.022 00:07:20.022 real 0m0.112s 00:07:20.022 user 0m0.036s 00:07:20.022 sys 0m0.084s 00:07:20.022 12:01:32 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:20.022 12:01:32 -- common/autotest_common.sh@10 -- # set +x 00:07:20.022 ************************************ 00:07:20.022 END TEST rpc_client 00:07:20.022 ************************************ 00:07:20.023 12:01:32 -- spdk/autotest.sh@178 -- # run_test json_config /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:07:20.023 12:01:32 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:20.023 12:01:32 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:20.023 12:01:32 -- common/autotest_common.sh@10 -- # set +x 00:07:20.023 ************************************ 00:07:20.023 START TEST json_config 00:07:20.023 ************************************ 00:07:20.023 12:01:32 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:07:20.282 12:01:33 -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:07:20.282 12:01:33 -- nvmf/common.sh@7 -- # uname -s 00:07:20.282 12:01:33 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:07:20.282 12:01:33 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:07:20.282 12:01:33 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:07:20.282 12:01:33 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:07:20.282 12:01:33 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:07:20.282 12:01:33 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:07:20.282 12:01:33 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:07:20.282 12:01:33 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:07:20.282 12:01:33 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:07:20.282 12:01:33 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:07:20.282 12:01:33 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:8023d868-666a-e711-906e-0017a4403562 00:07:20.282 12:01:33 -- nvmf/common.sh@18 -- # NVME_HOSTID=8023d868-666a-e711-906e-0017a4403562 00:07:20.282 12:01:33 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:07:20.282 12:01:33 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:07:20.282 12:01:33 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:07:20.282 12:01:33 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:07:20.282 12:01:33 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:20.282 12:01:33 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:20.282 12:01:33 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:20.282 12:01:33 -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:20.282 12:01:33 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:20.282 12:01:33 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:20.282 12:01:33 -- paths/export.sh@5 -- # export PATH 00:07:20.282 12:01:33 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:20.282 12:01:33 -- nvmf/common.sh@46 -- # : 0 00:07:20.282 12:01:33 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:07:20.282 12:01:33 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:07:20.283 12:01:33 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:07:20.283 12:01:33 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:07:20.283 12:01:33 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:07:20.283 12:01:33 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:07:20.283 12:01:33 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:07:20.283 12:01:33 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:07:20.283 12:01:33 -- json_config/json_config.sh@10 -- # [[ 0 -eq 1 ]] 00:07:20.283 12:01:33 -- json_config/json_config.sh@14 -- # [[ 0 -ne 1 ]] 00:07:20.283 12:01:33 -- json_config/json_config.sh@14 -- # [[ 0 -eq 1 ]] 00:07:20.283 12:01:33 -- json_config/json_config.sh@25 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:07:20.283 12:01:33 -- json_config/json_config.sh@26 -- # echo 'WARNING: No tests are enabled so not running JSON configuration tests' 00:07:20.283 WARNING: No tests are enabled so not running JSON configuration tests 00:07:20.283 12:01:33 -- json_config/json_config.sh@27 -- # exit 0 00:07:20.283 00:07:20.283 real 0m0.102s 00:07:20.283 user 0m0.049s 00:07:20.283 sys 0m0.053s 00:07:20.283 12:01:33 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:20.283 12:01:33 -- common/autotest_common.sh@10 -- # set +x 00:07:20.283 ************************************ 00:07:20.283 END TEST json_config 00:07:20.283 ************************************ 00:07:20.283 12:01:33 -- spdk/autotest.sh@179 -- # run_test json_config_extra_key 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:07:20.283 12:01:33 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:20.283 12:01:33 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:20.283 12:01:33 -- common/autotest_common.sh@10 -- # set +x 00:07:20.283 ************************************ 00:07:20.283 START TEST json_config_extra_key 00:07:20.283 ************************************ 00:07:20.283 12:01:33 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:07:20.283 12:01:33 -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:07:20.283 12:01:33 -- nvmf/common.sh@7 -- # uname -s 00:07:20.283 12:01:33 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:07:20.283 12:01:33 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:07:20.283 12:01:33 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:07:20.283 12:01:33 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:07:20.283 12:01:33 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:07:20.283 12:01:33 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:07:20.283 12:01:33 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:07:20.283 12:01:33 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:07:20.283 12:01:33 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:07:20.283 12:01:33 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:07:20.283 12:01:33 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:8023d868-666a-e711-906e-0017a4403562 00:07:20.283 12:01:33 -- nvmf/common.sh@18 -- # NVME_HOSTID=8023d868-666a-e711-906e-0017a4403562 00:07:20.283 12:01:33 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:07:20.283 12:01:33 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:07:20.283 12:01:33 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:07:20.283 12:01:33 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:07:20.283 12:01:33 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:20.283 12:01:33 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:20.283 12:01:33 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:20.283 12:01:33 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:20.283 12:01:33 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:20.283 12:01:33 -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:20.283 12:01:33 -- paths/export.sh@5 -- # export PATH 00:07:20.283 12:01:33 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:20.283 12:01:33 -- nvmf/common.sh@46 -- # : 0 00:07:20.283 12:01:33 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:07:20.283 12:01:33 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:07:20.283 12:01:33 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:07:20.283 12:01:33 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:07:20.283 12:01:33 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:07:20.283 12:01:33 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:07:20.283 12:01:33 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:07:20.283 12:01:33 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:07:20.283 12:01:33 -- json_config/json_config_extra_key.sh@16 -- # app_pid=(['target']='') 00:07:20.283 12:01:33 -- json_config/json_config_extra_key.sh@16 -- # declare -A app_pid 00:07:20.283 12:01:33 -- json_config/json_config_extra_key.sh@17 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:07:20.283 12:01:33 -- json_config/json_config_extra_key.sh@17 -- # declare -A app_socket 00:07:20.283 12:01:33 -- json_config/json_config_extra_key.sh@18 -- # app_params=(['target']='-m 0x1 -s 1024') 00:07:20.283 12:01:33 -- json_config/json_config_extra_key.sh@18 -- # declare -A app_params 00:07:20.283 12:01:33 -- json_config/json_config_extra_key.sh@19 -- # configs_path=(['target']='/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json') 00:07:20.283 12:01:33 -- json_config/json_config_extra_key.sh@19 -- # declare -A configs_path 00:07:20.283 12:01:33 -- json_config/json_config_extra_key.sh@74 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:07:20.283 12:01:33 -- json_config/json_config_extra_key.sh@76 -- # echo 'INFO: launching applications...' 00:07:20.283 INFO: launching applications... 00:07:20.283 12:01:33 -- json_config/json_config_extra_key.sh@77 -- # json_config_test_start_app target --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:07:20.283 12:01:33 -- json_config/json_config_extra_key.sh@24 -- # local app=target 00:07:20.283 12:01:33 -- json_config/json_config_extra_key.sh@25 -- # shift 00:07:20.283 12:01:33 -- json_config/json_config_extra_key.sh@27 -- # [[ -n 22 ]] 00:07:20.283 12:01:33 -- json_config/json_config_extra_key.sh@28 -- # [[ -z '' ]] 00:07:20.283 12:01:33 -- json_config/json_config_extra_key.sh@31 -- # app_pid[$app]=2672854 00:07:20.283 12:01:33 -- json_config/json_config_extra_key.sh@33 -- # echo 'Waiting for target to run...' 00:07:20.283 Waiting for target to run... 
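Here spdk_tgt was launched with "-m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json .../extra_key.json": load the subsystem configuration from JSON, then serve JSON-RPC on the socket that waitforlisten polls below. In C terms the target's startup amounts to roughly this sketch, assuming the v24.01-era <spdk/event.h>; start_cb is a placeholder:

/* Apply a JSON config and serve RPC on a UNIX domain socket. */
#include <spdk/event.h>

static void
start_cb(void *ctx)
{
	/* Reached once the JSON config has been applied; the app now sits
	 * in its reactor loop serving RPC until spdk_app_stop(). */
	(void)ctx;
}

int
main(int argc, char **argv)
{
	struct spdk_app_opts opts;
	int rc;

	(void)argc;
	(void)argv;
	spdk_app_opts_init(&opts, sizeof(opts));
	opts.name = "spdk_tgt";
	opts.json_config_file = "extra_key.json";   /* --json */
	opts.rpc_addr = "/var/tmp/spdk_tgt.sock";   /* -r */

	rc = spdk_app_start(&opts, start_cb, NULL); /* returns after stop */
	spdk_app_fini();
	return rc;
}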
00:07:20.283 12:01:33 -- json_config/json_config_extra_key.sh@34 -- # waitforlisten 2672854 /var/tmp/spdk_tgt.sock 00:07:20.283 12:01:33 -- common/autotest_common.sh@819 -- # '[' -z 2672854 ']' 00:07:20.283 12:01:33 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:07:20.283 12:01:33 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:20.283 12:01:33 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:07:20.283 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:07:20.283 12:01:33 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:20.283 12:01:33 -- common/autotest_common.sh@10 -- # set +x 00:07:20.283 12:01:33 -- json_config/json_config_extra_key.sh@30 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:07:20.283 [2024-06-11 12:01:33.246825] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:07:20.283 [2024-06-11 12:01:33.246894] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2672854 ] 00:07:20.283 EAL: No free 2048 kB hugepages reported on node 1 00:07:20.927 [2024-06-11 12:01:33.606747] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:20.927 [2024-06-11 12:01:33.633723] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:20.927 [2024-06-11 12:01:33.633850] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:21.186 12:01:34 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:21.186 12:01:34 -- common/autotest_common.sh@852 -- # return 0 00:07:21.186 12:01:34 -- json_config/json_config_extra_key.sh@35 -- # echo '' 00:07:21.186 00:07:21.186 12:01:34 -- json_config/json_config_extra_key.sh@79 -- # echo 'INFO: shutting down applications...' 00:07:21.186 INFO: shutting down applications... 
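The waitforlisten step above is the load-bearing part of json_config_test_start_app: spdk_tgt is launched in the background and the test blocks until the RPC socket answers. A minimal sketch of that wait, assuming a simple rpc.py probe rather than the verbatim autotest_common.sh helper (socket path, retry count, and rpc.py flags are the ones seen in this trace; the 0.1 s interval is an assumption):

    sock=/var/tmp/spdk_tgt.sock
    for ((i = 0; i < 100; i++)); do            # local max_retries=100, as logged
        [ -S "$sock" ] && scripts/rpc.py -s "$sock" -t 1 rpc_get_methods \
            >/dev/null 2>&1 && break           # socket exists and target answers
        sleep 0.1                              # retry interval assumed
    done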
00:07:21.186 12:01:34 -- json_config/json_config_extra_key.sh@80 -- # json_config_test_shutdown_app target 00:07:21.186 12:01:34 -- json_config/json_config_extra_key.sh@40 -- # local app=target 00:07:21.186 12:01:34 -- json_config/json_config_extra_key.sh@43 -- # [[ -n 22 ]] 00:07:21.186 12:01:34 -- json_config/json_config_extra_key.sh@44 -- # [[ -n 2672854 ]] 00:07:21.186 12:01:34 -- json_config/json_config_extra_key.sh@47 -- # kill -SIGINT 2672854 00:07:21.186 12:01:34 -- json_config/json_config_extra_key.sh@49 -- # (( i = 0 )) 00:07:21.186 12:01:34 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 )) 00:07:21.186 12:01:34 -- json_config/json_config_extra_key.sh@50 -- # kill -0 2672854 00:07:21.186 12:01:34 -- json_config/json_config_extra_key.sh@54 -- # sleep 0.5 00:07:21.754 12:01:34 -- json_config/json_config_extra_key.sh@49 -- # (( i++ )) 00:07:21.754 12:01:34 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 )) 00:07:21.754 12:01:34 -- json_config/json_config_extra_key.sh@50 -- # kill -0 2672854 00:07:21.754 12:01:34 -- json_config/json_config_extra_key.sh@51 -- # app_pid[$app]= 00:07:21.754 12:01:34 -- json_config/json_config_extra_key.sh@52 -- # break 00:07:21.754 12:01:34 -- json_config/json_config_extra_key.sh@57 -- # [[ -n '' ]] 00:07:21.754 12:01:34 -- json_config/json_config_extra_key.sh@62 -- # echo 'SPDK target shutdown done' 00:07:21.754 SPDK target shutdown done 00:07:21.754 12:01:34 -- json_config/json_config_extra_key.sh@82 -- # echo Success 00:07:21.754 Success 00:07:21.754 00:07:21.754 real 0m1.557s 00:07:21.754 user 0m1.393s 00:07:21.754 sys 0m0.458s 00:07:21.754 12:01:34 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:21.754 12:01:34 -- common/autotest_common.sh@10 -- # set +x 00:07:21.754 ************************************ 00:07:21.754 END TEST json_config_extra_key 00:07:21.754 ************************************ 00:07:21.754 12:01:34 -- spdk/autotest.sh@180 -- # run_test alias_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:07:21.754 12:01:34 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:21.754 12:01:34 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:21.754 12:01:34 -- common/autotest_common.sh@10 -- # set +x 00:07:21.754 ************************************ 00:07:21.754 START TEST alias_rpc 00:07:21.754 ************************************ 00:07:21.754 12:01:34 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:07:22.013 * Looking for test storage... 00:07:22.013 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc 00:07:22.013 12:01:34 -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:22.013 12:01:34 -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=2673079 00:07:22.013 12:01:34 -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 2673079 00:07:22.013 12:01:34 -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:22.013 12:01:34 -- common/autotest_common.sh@819 -- # '[' -z 2673079 ']' 00:07:22.013 12:01:34 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:22.013 12:01:34 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:22.013 12:01:34 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
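The teardown just traced is the standard graceful-stop pattern used throughout this run: send SIGINT, then poll with kill -0 until the PID disappears or 30 half-second tries elapse. Condensed from the xtrace above:

    kill -SIGINT "$pid"                        # ask spdk_tgt to exit cleanly
    for ((i = 0; i < 30; i++)); do
        kill -0 "$pid" 2>/dev/null || break    # kill -0 probes liveness only
        sleep 0.5
    done
    echo 'SPDK target shutdown done'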
00:07:22.013 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:22.013 12:01:34 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:22.013 12:01:34 -- common/autotest_common.sh@10 -- # set +x 00:07:22.013 [2024-06-11 12:01:34.872744] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:07:22.013 [2024-06-11 12:01:34.872847] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2673079 ] 00:07:22.013 EAL: No free 2048 kB hugepages reported on node 1 00:07:22.013 [2024-06-11 12:01:34.992778] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:22.013 [2024-06-11 12:01:35.037622] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:22.013 [2024-06-11 12:01:35.037771] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:22.951 12:01:35 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:22.951 12:01:35 -- common/autotest_common.sh@852 -- # return 0 00:07:22.951 12:01:35 -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py load_config -i 00:07:23.210 12:01:36 -- alias_rpc/alias_rpc.sh@19 -- # killprocess 2673079 00:07:23.210 12:01:36 -- common/autotest_common.sh@926 -- # '[' -z 2673079 ']' 00:07:23.210 12:01:36 -- common/autotest_common.sh@930 -- # kill -0 2673079 00:07:23.210 12:01:36 -- common/autotest_common.sh@931 -- # uname 00:07:23.210 12:01:36 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:07:23.210 12:01:36 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2673079 00:07:23.210 12:01:36 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:07:23.210 12:01:36 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:07:23.210 12:01:36 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2673079' 00:07:23.210 killing process with pid 2673079 00:07:23.210 12:01:36 -- common/autotest_common.sh@945 -- # kill 2673079 00:07:23.210 12:01:36 -- common/autotest_common.sh@950 -- # wait 2673079 00:07:23.469 00:07:23.469 real 0m1.733s 00:07:23.469 user 0m1.934s 00:07:23.469 sys 0m0.535s 00:07:23.469 12:01:36 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:23.469 12:01:36 -- common/autotest_common.sh@10 -- # set +x 00:07:23.469 ************************************ 00:07:23.469 END TEST alias_rpc 00:07:23.469 ************************************ 00:07:23.728 12:01:36 -- spdk/autotest.sh@182 -- # [[ 0 -eq 0 ]] 00:07:23.729 12:01:36 -- spdk/autotest.sh@183 -- # run_test spdkcli_tcp /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:07:23.729 12:01:36 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:23.729 12:01:36 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:23.729 12:01:36 -- common/autotest_common.sh@10 -- # set +x 00:07:23.729 ************************************ 00:07:23.729 START TEST spdkcli_tcp 00:07:23.729 ************************************ 00:07:23.729 12:01:36 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:07:23.729 * Looking for test storage... 
00:07:23.729 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli 00:07:23.729 12:01:36 -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/common.sh 00:07:23.729 12:01:36 -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:07:23.729 12:01:36 -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/clear_config.py 00:07:23.729 12:01:36 -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:07:23.729 12:01:36 -- spdkcli/tcp.sh@19 -- # PORT=9998 00:07:23.729 12:01:36 -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:07:23.729 12:01:36 -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:07:23.729 12:01:36 -- common/autotest_common.sh@712 -- # xtrace_disable 00:07:23.729 12:01:36 -- common/autotest_common.sh@10 -- # set +x 00:07:23.729 12:01:36 -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=2673322 00:07:23.729 12:01:36 -- spdkcli/tcp.sh@27 -- # waitforlisten 2673322 00:07:23.729 12:01:36 -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:07:23.729 12:01:36 -- common/autotest_common.sh@819 -- # '[' -z 2673322 ']' 00:07:23.729 12:01:36 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:23.729 12:01:36 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:23.729 12:01:36 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:23.729 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:23.729 12:01:36 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:23.729 12:01:36 -- common/autotest_common.sh@10 -- # set +x 00:07:23.729 [2024-06-11 12:01:36.668084] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 
00:07:23.729 [2024-06-11 12:01:36.668185] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2673322 ] 00:07:23.729 EAL: No free 2048 kB hugepages reported on node 1 00:07:23.988 [2024-06-11 12:01:36.773557] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:23.988 [2024-06-11 12:01:36.819509] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:23.988 [2024-06-11 12:01:36.819688] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:23.988 [2024-06-11 12:01:36.819692] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:24.556 12:01:37 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:24.556 12:01:37 -- common/autotest_common.sh@852 -- # return 0 00:07:24.556 12:01:37 -- spdkcli/tcp.sh@31 -- # socat_pid=2673499 00:07:24.556 12:01:37 -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:07:24.556 12:01:37 -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:07:24.815 [ 00:07:24.815 "spdk_get_version", 00:07:24.815 "rpc_get_methods", 00:07:24.815 "trace_get_info", 00:07:24.815 "trace_get_tpoint_group_mask", 00:07:24.815 "trace_disable_tpoint_group", 00:07:24.816 "trace_enable_tpoint_group", 00:07:24.816 "trace_clear_tpoint_mask", 00:07:24.816 "trace_set_tpoint_mask", 00:07:24.816 "vfu_tgt_set_base_path", 00:07:24.816 "framework_get_pci_devices", 00:07:24.816 "framework_get_config", 00:07:24.816 "framework_get_subsystems", 00:07:24.816 "iobuf_get_stats", 00:07:24.816 "iobuf_set_options", 00:07:24.816 "sock_set_default_impl", 00:07:24.816 "sock_impl_set_options", 00:07:24.816 "sock_impl_get_options", 00:07:24.816 "vmd_rescan", 00:07:24.816 "vmd_remove_device", 00:07:24.816 "vmd_enable", 00:07:24.816 "accel_get_stats", 00:07:24.816 "accel_set_options", 00:07:24.816 "accel_set_driver", 00:07:24.816 "accel_crypto_key_destroy", 00:07:24.816 "accel_crypto_keys_get", 00:07:24.816 "accel_crypto_key_create", 00:07:24.816 "accel_assign_opc", 00:07:24.816 "accel_get_module_info", 00:07:24.816 "accel_get_opc_assignments", 00:07:24.816 "notify_get_notifications", 00:07:24.816 "notify_get_types", 00:07:24.816 "bdev_get_histogram", 00:07:24.816 "bdev_enable_histogram", 00:07:24.816 "bdev_set_qos_limit", 00:07:24.816 "bdev_set_qd_sampling_period", 00:07:24.816 "bdev_get_bdevs", 00:07:24.816 "bdev_reset_iostat", 00:07:24.816 "bdev_get_iostat", 00:07:24.816 "bdev_examine", 00:07:24.816 "bdev_wait_for_examine", 00:07:24.816 "bdev_set_options", 00:07:24.816 "scsi_get_devices", 00:07:24.816 "thread_set_cpumask", 00:07:24.816 "framework_get_scheduler", 00:07:24.816 "framework_set_scheduler", 00:07:24.816 "framework_get_reactors", 00:07:24.816 "thread_get_io_channels", 00:07:24.816 "thread_get_pollers", 00:07:24.816 "thread_get_stats", 00:07:24.816 "framework_monitor_context_switch", 00:07:24.816 "spdk_kill_instance", 00:07:24.816 "log_enable_timestamps", 00:07:24.816 "log_get_flags", 00:07:24.816 "log_clear_flag", 00:07:24.816 "log_set_flag", 00:07:24.816 "log_get_level", 00:07:24.816 "log_set_level", 00:07:24.816 "log_get_print_level", 00:07:24.816 "log_set_print_level", 00:07:24.816 "framework_enable_cpumask_locks", 00:07:24.816 "framework_disable_cpumask_locks", 00:07:24.816 "framework_wait_init", 00:07:24.816 
"framework_start_init", 00:07:24.816 "virtio_blk_create_transport", 00:07:24.816 "virtio_blk_get_transports", 00:07:24.816 "vhost_controller_set_coalescing", 00:07:24.816 "vhost_get_controllers", 00:07:24.816 "vhost_delete_controller", 00:07:24.816 "vhost_create_blk_controller", 00:07:24.816 "vhost_scsi_controller_remove_target", 00:07:24.816 "vhost_scsi_controller_add_target", 00:07:24.816 "vhost_start_scsi_controller", 00:07:24.816 "vhost_create_scsi_controller", 00:07:24.816 "ublk_recover_disk", 00:07:24.816 "ublk_get_disks", 00:07:24.816 "ublk_stop_disk", 00:07:24.816 "ublk_start_disk", 00:07:24.816 "ublk_destroy_target", 00:07:24.816 "ublk_create_target", 00:07:24.816 "nbd_get_disks", 00:07:24.816 "nbd_stop_disk", 00:07:24.816 "nbd_start_disk", 00:07:24.816 "env_dpdk_get_mem_stats", 00:07:24.816 "nvmf_subsystem_get_listeners", 00:07:24.816 "nvmf_subsystem_get_qpairs", 00:07:24.816 "nvmf_subsystem_get_controllers", 00:07:24.816 "nvmf_get_stats", 00:07:24.816 "nvmf_get_transports", 00:07:24.816 "nvmf_create_transport", 00:07:24.816 "nvmf_get_targets", 00:07:24.816 "nvmf_delete_target", 00:07:24.816 "nvmf_create_target", 00:07:24.816 "nvmf_subsystem_allow_any_host", 00:07:24.816 "nvmf_subsystem_remove_host", 00:07:24.816 "nvmf_subsystem_add_host", 00:07:24.816 "nvmf_subsystem_remove_ns", 00:07:24.816 "nvmf_subsystem_add_ns", 00:07:24.816 "nvmf_subsystem_listener_set_ana_state", 00:07:24.816 "nvmf_discovery_get_referrals", 00:07:24.816 "nvmf_discovery_remove_referral", 00:07:24.816 "nvmf_discovery_add_referral", 00:07:24.816 "nvmf_subsystem_remove_listener", 00:07:24.816 "nvmf_subsystem_add_listener", 00:07:24.816 "nvmf_delete_subsystem", 00:07:24.816 "nvmf_create_subsystem", 00:07:24.816 "nvmf_get_subsystems", 00:07:24.816 "nvmf_set_crdt", 00:07:24.816 "nvmf_set_config", 00:07:24.816 "nvmf_set_max_subsystems", 00:07:24.816 "iscsi_set_options", 00:07:24.816 "iscsi_get_auth_groups", 00:07:24.816 "iscsi_auth_group_remove_secret", 00:07:24.816 "iscsi_auth_group_add_secret", 00:07:24.816 "iscsi_delete_auth_group", 00:07:24.816 "iscsi_create_auth_group", 00:07:24.816 "iscsi_set_discovery_auth", 00:07:24.816 "iscsi_get_options", 00:07:24.816 "iscsi_target_node_request_logout", 00:07:24.816 "iscsi_target_node_set_redirect", 00:07:24.816 "iscsi_target_node_set_auth", 00:07:24.816 "iscsi_target_node_add_lun", 00:07:24.816 "iscsi_get_connections", 00:07:24.816 "iscsi_portal_group_set_auth", 00:07:24.816 "iscsi_start_portal_group", 00:07:24.816 "iscsi_delete_portal_group", 00:07:24.816 "iscsi_create_portal_group", 00:07:24.816 "iscsi_get_portal_groups", 00:07:24.816 "iscsi_delete_target_node", 00:07:24.816 "iscsi_target_node_remove_pg_ig_maps", 00:07:24.816 "iscsi_target_node_add_pg_ig_maps", 00:07:24.816 "iscsi_create_target_node", 00:07:24.816 "iscsi_get_target_nodes", 00:07:24.816 "iscsi_delete_initiator_group", 00:07:24.816 "iscsi_initiator_group_remove_initiators", 00:07:24.816 "iscsi_initiator_group_add_initiators", 00:07:24.816 "iscsi_create_initiator_group", 00:07:24.816 "iscsi_get_initiator_groups", 00:07:24.816 "vfu_virtio_create_scsi_endpoint", 00:07:24.816 "vfu_virtio_scsi_remove_target", 00:07:24.816 "vfu_virtio_scsi_add_target", 00:07:24.816 "vfu_virtio_create_blk_endpoint", 00:07:24.816 "vfu_virtio_delete_endpoint", 00:07:24.816 "iaa_scan_accel_module", 00:07:24.816 "dsa_scan_accel_module", 00:07:24.816 "ioat_scan_accel_module", 00:07:24.816 "accel_error_inject_error", 00:07:24.816 "bdev_iscsi_delete", 00:07:24.816 "bdev_iscsi_create", 00:07:24.816 "bdev_iscsi_set_options", 
00:07:24.816 "bdev_virtio_attach_controller", 00:07:24.816 "bdev_virtio_scsi_get_devices", 00:07:24.816 "bdev_virtio_detach_controller", 00:07:24.816 "bdev_virtio_blk_set_hotplug", 00:07:24.816 "bdev_ftl_set_property", 00:07:24.816 "bdev_ftl_get_properties", 00:07:24.816 "bdev_ftl_get_stats", 00:07:24.816 "bdev_ftl_unmap", 00:07:24.816 "bdev_ftl_unload", 00:07:24.816 "bdev_ftl_delete", 00:07:24.816 "bdev_ftl_load", 00:07:24.816 "bdev_ftl_create", 00:07:24.816 "bdev_aio_delete", 00:07:24.816 "bdev_aio_rescan", 00:07:24.816 "bdev_aio_create", 00:07:24.816 "blobfs_create", 00:07:24.816 "blobfs_detect", 00:07:24.816 "blobfs_set_cache_size", 00:07:24.816 "bdev_zone_block_delete", 00:07:24.816 "bdev_zone_block_create", 00:07:24.816 "bdev_delay_delete", 00:07:24.816 "bdev_delay_create", 00:07:24.816 "bdev_delay_update_latency", 00:07:24.816 "bdev_split_delete", 00:07:24.816 "bdev_split_create", 00:07:24.816 "bdev_error_inject_error", 00:07:24.816 "bdev_error_delete", 00:07:24.816 "bdev_error_create", 00:07:24.816 "bdev_raid_set_options", 00:07:24.816 "bdev_raid_remove_base_bdev", 00:07:24.816 "bdev_raid_add_base_bdev", 00:07:24.816 "bdev_raid_delete", 00:07:24.816 "bdev_raid_create", 00:07:24.816 "bdev_raid_get_bdevs", 00:07:24.816 "bdev_lvol_grow_lvstore", 00:07:24.816 "bdev_lvol_get_lvols", 00:07:24.816 "bdev_lvol_get_lvstores", 00:07:24.816 "bdev_lvol_delete", 00:07:24.816 "bdev_lvol_set_read_only", 00:07:24.816 "bdev_lvol_resize", 00:07:24.816 "bdev_lvol_decouple_parent", 00:07:24.816 "bdev_lvol_inflate", 00:07:24.816 "bdev_lvol_rename", 00:07:24.816 "bdev_lvol_clone_bdev", 00:07:24.816 "bdev_lvol_clone", 00:07:24.816 "bdev_lvol_snapshot", 00:07:24.816 "bdev_lvol_create", 00:07:24.816 "bdev_lvol_delete_lvstore", 00:07:24.816 "bdev_lvol_rename_lvstore", 00:07:24.816 "bdev_lvol_create_lvstore", 00:07:24.816 "bdev_passthru_delete", 00:07:24.816 "bdev_passthru_create", 00:07:24.816 "bdev_nvme_cuse_unregister", 00:07:24.816 "bdev_nvme_cuse_register", 00:07:24.816 "bdev_opal_new_user", 00:07:24.816 "bdev_opal_set_lock_state", 00:07:24.816 "bdev_opal_delete", 00:07:24.816 "bdev_opal_get_info", 00:07:24.816 "bdev_opal_create", 00:07:24.816 "bdev_nvme_opal_revert", 00:07:24.816 "bdev_nvme_opal_init", 00:07:24.816 "bdev_nvme_send_cmd", 00:07:24.816 "bdev_nvme_get_path_iostat", 00:07:24.816 "bdev_nvme_get_mdns_discovery_info", 00:07:24.816 "bdev_nvme_stop_mdns_discovery", 00:07:24.816 "bdev_nvme_start_mdns_discovery", 00:07:24.816 "bdev_nvme_set_multipath_policy", 00:07:24.816 "bdev_nvme_set_preferred_path", 00:07:24.816 "bdev_nvme_get_io_paths", 00:07:24.816 "bdev_nvme_remove_error_injection", 00:07:24.816 "bdev_nvme_add_error_injection", 00:07:24.816 "bdev_nvme_get_discovery_info", 00:07:24.816 "bdev_nvme_stop_discovery", 00:07:24.816 "bdev_nvme_start_discovery", 00:07:24.816 "bdev_nvme_get_controller_health_info", 00:07:24.816 "bdev_nvme_disable_controller", 00:07:24.816 "bdev_nvme_enable_controller", 00:07:24.816 "bdev_nvme_reset_controller", 00:07:24.816 "bdev_nvme_get_transport_statistics", 00:07:24.816 "bdev_nvme_apply_firmware", 00:07:24.816 "bdev_nvme_detach_controller", 00:07:24.816 "bdev_nvme_get_controllers", 00:07:24.816 "bdev_nvme_attach_controller", 00:07:24.816 "bdev_nvme_set_hotplug", 00:07:24.816 "bdev_nvme_set_options", 00:07:24.816 "bdev_null_resize", 00:07:24.816 "bdev_null_delete", 00:07:24.816 "bdev_null_create", 00:07:24.816 "bdev_malloc_delete", 00:07:24.816 "bdev_malloc_create" 00:07:24.816 ] 00:07:24.816 12:01:37 -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 
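The method list above is fetched over TCP even though spdk_tgt serves RPC on a UNIX domain socket here: the test parks socat as a TCP-to-UNIX relay and points rpc.py at it. Reduced to its moving parts, with the exact flags from this run (-r retries, -t timeout, -s address, -p port):

    socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock &   # TCP front, UNIX back
    socat_pid=$!
    scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods
    kill "$socat_pid"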
00:07:24.816 12:01:37 -- common/autotest_common.sh@718 -- # xtrace_disable 00:07:24.816 12:01:37 -- common/autotest_common.sh@10 -- # set +x 00:07:24.816 12:01:37 -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:07:24.817 12:01:37 -- spdkcli/tcp.sh@38 -- # killprocess 2673322 00:07:24.817 12:01:37 -- common/autotest_common.sh@926 -- # '[' -z 2673322 ']' 00:07:24.817 12:01:37 -- common/autotest_common.sh@930 -- # kill -0 2673322 00:07:24.817 12:01:37 -- common/autotest_common.sh@931 -- # uname 00:07:24.817 12:01:37 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:07:24.817 12:01:37 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2673322 00:07:24.817 12:01:37 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:07:24.817 12:01:37 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:07:24.817 12:01:37 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2673322' 00:07:24.817 killing process with pid 2673322 00:07:24.817 12:01:37 -- common/autotest_common.sh@945 -- # kill 2673322 00:07:24.817 12:01:37 -- common/autotest_common.sh@950 -- # wait 2673322 00:07:25.386 00:07:25.386 real 0m1.612s 00:07:25.386 user 0m2.967s 00:07:25.386 sys 0m0.545s 00:07:25.386 12:01:38 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:25.386 12:01:38 -- common/autotest_common.sh@10 -- # set +x 00:07:25.386 ************************************ 00:07:25.386 END TEST spdkcli_tcp 00:07:25.386 ************************************ 00:07:25.386 12:01:38 -- spdk/autotest.sh@186 -- # run_test dpdk_mem_utility /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:07:25.386 12:01:38 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:25.386 12:01:38 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:25.386 12:01:38 -- common/autotest_common.sh@10 -- # set +x 00:07:25.386 ************************************ 00:07:25.386 START TEST dpdk_mem_utility 00:07:25.386 ************************************ 00:07:25.386 12:01:38 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:07:25.386 * Looking for test storage... 00:07:25.386 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility 00:07:25.386 12:01:38 -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:07:25.386 12:01:38 -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=2673585 00:07:25.386 12:01:38 -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 2673585 00:07:25.386 12:01:38 -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:25.386 12:01:38 -- common/autotest_common.sh@819 -- # '[' -z 2673585 ']' 00:07:25.386 12:01:38 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:25.386 12:01:38 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:25.386 12:01:38 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:25.386 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
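killprocess, which just reaped the spdkcli_tcp target, is the counterpart helper used after every test in this log: confirm the PID is alive, resolve its comm name so sudo wrappers can be special-cased, then kill and wait so the exit status is collected. A sketch of the checks visible in the trace (the sudo branch is abbreviated, not the full helper):

    kill -0 "$pid"                                  # must still be running
    process_name=$(ps --no-headers -o comm= "$pid")
    [ "$process_name" = sudo ] || kill "$pid"       # sudo parents need extra handling
    wait "$pid"                                     # reap; propagate exit status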
00:07:25.386 12:01:38 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:25.386 12:01:38 -- common/autotest_common.sh@10 -- # set +x 00:07:25.386 [2024-06-11 12:01:38.310498] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:07:25.386 [2024-06-11 12:01:38.310589] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2673585 ] 00:07:25.386 EAL: No free 2048 kB hugepages reported on node 1 00:07:25.645 [2024-06-11 12:01:38.429540] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:25.645 [2024-06-11 12:01:38.478331] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:25.645 [2024-06-11 12:01:38.478488] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:26.583 12:01:39 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:26.583 12:01:39 -- common/autotest_common.sh@852 -- # return 0 00:07:26.583 12:01:39 -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:07:26.583 12:01:39 -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:07:26.583 12:01:39 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:26.583 12:01:39 -- common/autotest_common.sh@10 -- # set +x 00:07:26.583 { 00:07:26.583 "filename": "/tmp/spdk_mem_dump.txt" 00:07:26.583 } 00:07:26.583 12:01:39 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:26.583 12:01:39 -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:07:26.583 DPDK memory size 814.000000 MiB in 1 heap(s) 00:07:26.583 1 heaps totaling size 814.000000 MiB 00:07:26.583 size: 814.000000 MiB heap id: 0 00:07:26.583 end heaps---------- 00:07:26.583 8 mempools totaling size 598.116089 MiB 00:07:26.583 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:07:26.583 size: 158.602051 MiB name: PDU_data_out_Pool 00:07:26.583 size: 84.521057 MiB name: bdev_io_2673585 00:07:26.583 size: 51.011292 MiB name: evtpool_2673585 00:07:26.583 size: 50.003479 MiB name: msgpool_2673585 00:07:26.583 size: 21.763794 MiB name: PDU_Pool 00:07:26.583 size: 19.513306 MiB name: SCSI_TASK_Pool 00:07:26.584 size: 0.026123 MiB name: Session_Pool 00:07:26.584 end mempools------- 00:07:26.584 6 memzones totaling size 4.142822 MiB 00:07:26.584 size: 1.000366 MiB name: RG_ring_0_2673585 00:07:26.584 size: 1.000366 MiB name: RG_ring_1_2673585 00:07:26.584 size: 1.000366 MiB name: RG_ring_4_2673585 00:07:26.584 size: 1.000366 MiB name: RG_ring_5_2673585 00:07:26.584 size: 0.125366 MiB name: RG_ring_2_2673585 00:07:26.584 size: 0.015991 MiB name: RG_ring_3_2673585 00:07:26.584 end memzones------- 00:07:26.584 12:01:39 -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:07:26.584 heap id: 0 total size: 814.000000 MiB number of busy elements: 41 number of free elements: 15 00:07:26.584 list of free elements. 
size: 12.519348 MiB 00:07:26.584 element at address: 0x200000400000 with size: 1.999512 MiB 00:07:26.584 element at address: 0x200018e00000 with size: 0.999878 MiB 00:07:26.584 element at address: 0x200019000000 with size: 0.999878 MiB 00:07:26.584 element at address: 0x200003e00000 with size: 0.996277 MiB 00:07:26.584 element at address: 0x200031c00000 with size: 0.994446 MiB 00:07:26.584 element at address: 0x200013800000 with size: 0.978699 MiB 00:07:26.584 element at address: 0x200007000000 with size: 0.959839 MiB 00:07:26.584 element at address: 0x200019200000 with size: 0.936584 MiB 00:07:26.584 element at address: 0x200000200000 with size: 0.841614 MiB 00:07:26.584 element at address: 0x20001aa00000 with size: 0.582886 MiB 00:07:26.584 element at address: 0x20000b200000 with size: 0.490723 MiB 00:07:26.584 element at address: 0x200000800000 with size: 0.487793 MiB 00:07:26.584 element at address: 0x200019400000 with size: 0.485657 MiB 00:07:26.584 element at address: 0x200027e00000 with size: 0.410034 MiB 00:07:26.584 element at address: 0x200003a00000 with size: 0.355530 MiB 00:07:26.584 list of standard malloc elements. size: 199.218079 MiB 00:07:26.584 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:07:26.584 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:07:26.584 element at address: 0x200018efff80 with size: 1.000122 MiB 00:07:26.584 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:07:26.584 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:07:26.584 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:07:26.584 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:07:26.584 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:07:26.584 element at address: 0x2000192efdc0 with size: 0.000305 MiB 00:07:26.584 element at address: 0x2000002d7740 with size: 0.000183 MiB 00:07:26.584 element at address: 0x2000002d7800 with size: 0.000183 MiB 00:07:26.584 element at address: 0x2000002d78c0 with size: 0.000183 MiB 00:07:26.584 element at address: 0x2000002d7ac0 with size: 0.000183 MiB 00:07:26.584 element at address: 0x2000002d7b80 with size: 0.000183 MiB 00:07:26.584 element at address: 0x2000002d7c40 with size: 0.000183 MiB 00:07:26.584 element at address: 0x2000003d9e40 with size: 0.000183 MiB 00:07:26.584 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:07:26.584 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:07:26.584 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:07:26.584 element at address: 0x200003a5b040 with size: 0.000183 MiB 00:07:26.584 element at address: 0x200003adb300 with size: 0.000183 MiB 00:07:26.584 element at address: 0x200003adb500 with size: 0.000183 MiB 00:07:26.584 element at address: 0x200003adf7c0 with size: 0.000183 MiB 00:07:26.584 element at address: 0x200003affa80 with size: 0.000183 MiB 00:07:26.584 element at address: 0x200003affb40 with size: 0.000183 MiB 00:07:26.584 element at address: 0x200003eff0c0 with size: 0.000183 MiB 00:07:26.584 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:07:26.584 element at address: 0x20000b27da00 with size: 0.000183 MiB 00:07:26.584 element at address: 0x20000b27dac0 with size: 0.000183 MiB 00:07:26.584 element at address: 0x20000b2fdd80 with size: 0.000183 MiB 00:07:26.584 element at address: 0x2000138fa8c0 with size: 0.000183 MiB 00:07:26.584 element at address: 0x2000192efc40 with size: 0.000183 MiB 00:07:26.584 element at address: 0x2000192efd00 with size: 0.000183 MiB 
00:07:26.584 element at address: 0x2000194bc740 with size: 0.000183 MiB 00:07:26.584 element at address: 0x20001aa95380 with size: 0.000183 MiB 00:07:26.584 element at address: 0x20001aa95440 with size: 0.000183 MiB 00:07:26.584 element at address: 0x200027e68f80 with size: 0.000183 MiB 00:07:26.584 element at address: 0x200027e69040 with size: 0.000183 MiB 00:07:26.584 element at address: 0x200027e6fc40 with size: 0.000183 MiB 00:07:26.584 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:07:26.584 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:07:26.584 list of memzone associated elements. size: 602.262573 MiB 00:07:26.584 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:07:26.584 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:07:26.584 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:07:26.584 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:07:26.584 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:07:26.584 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_2673585_0 00:07:26.584 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:07:26.584 associated memzone info: size: 48.002930 MiB name: MP_evtpool_2673585_0 00:07:26.584 element at address: 0x200003fff380 with size: 48.003052 MiB 00:07:26.584 associated memzone info: size: 48.002930 MiB name: MP_msgpool_2673585_0 00:07:26.584 element at address: 0x2000195be940 with size: 20.255554 MiB 00:07:26.584 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:07:26.584 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:07:26.584 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:07:26.584 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:07:26.584 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_2673585 00:07:26.584 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:07:26.584 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_2673585 00:07:26.584 element at address: 0x2000002d7d00 with size: 1.008118 MiB 00:07:26.584 associated memzone info: size: 1.007996 MiB name: MP_evtpool_2673585 00:07:26.584 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:07:26.584 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:07:26.584 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:07:26.584 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:07:26.584 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:07:26.584 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:07:26.584 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:07:26.584 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:07:26.584 element at address: 0x200003eff180 with size: 1.000488 MiB 00:07:26.584 associated memzone info: size: 1.000366 MiB name: RG_ring_0_2673585 00:07:26.584 element at address: 0x200003affc00 with size: 1.000488 MiB 00:07:26.584 associated memzone info: size: 1.000366 MiB name: RG_ring_1_2673585 00:07:26.584 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:07:26.584 associated memzone info: size: 1.000366 MiB name: RG_ring_4_2673585 00:07:26.584 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:07:26.584 associated memzone info: size: 1.000366 MiB name: RG_ring_5_2673585 00:07:26.584 element at address: 0x200003a5b100 with size: 0.500488 MiB 00:07:26.584 associated 
memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_2673585 00:07:26.584 element at address: 0x20000b27db80 with size: 0.500488 MiB 00:07:26.584 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:07:26.584 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:07:26.584 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:07:26.584 element at address: 0x20001947c540 with size: 0.250488 MiB 00:07:26.584 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:07:26.584 element at address: 0x200003adf880 with size: 0.125488 MiB 00:07:26.584 associated memzone info: size: 0.125366 MiB name: RG_ring_2_2673585 00:07:26.584 element at address: 0x2000070f5b80 with size: 0.031738 MiB 00:07:26.584 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:07:26.584 element at address: 0x200027e69100 with size: 0.023743 MiB 00:07:26.584 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:07:26.584 element at address: 0x200003adb5c0 with size: 0.016113 MiB 00:07:26.584 associated memzone info: size: 0.015991 MiB name: RG_ring_3_2673585 00:07:26.584 element at address: 0x200027e6f240 with size: 0.002441 MiB 00:07:26.584 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:07:26.584 element at address: 0x2000002d7980 with size: 0.000305 MiB 00:07:26.584 associated memzone info: size: 0.000183 MiB name: MP_msgpool_2673585 00:07:26.584 element at address: 0x200003adb3c0 with size: 0.000305 MiB 00:07:26.584 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_2673585 00:07:26.584 element at address: 0x200027e6fd00 with size: 0.000305 MiB 00:07:26.584 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:07:26.584 12:01:39 -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:07:26.584 12:01:39 -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 2673585 00:07:26.584 12:01:39 -- common/autotest_common.sh@926 -- # '[' -z 2673585 ']' 00:07:26.584 12:01:39 -- common/autotest_common.sh@930 -- # kill -0 2673585 00:07:26.584 12:01:39 -- common/autotest_common.sh@931 -- # uname 00:07:26.584 12:01:39 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:07:26.584 12:01:39 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2673585 00:07:26.584 12:01:39 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:07:26.584 12:01:39 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:07:26.584 12:01:39 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2673585' 00:07:26.584 killing process with pid 2673585 00:07:26.584 12:01:39 -- common/autotest_common.sh@945 -- # kill 2673585 00:07:26.584 12:01:39 -- common/autotest_common.sh@950 -- # wait 2673585 00:07:26.844 00:07:26.844 real 0m1.596s 00:07:26.844 user 0m1.682s 00:07:26.844 sys 0m0.530s 00:07:26.844 12:01:39 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:26.844 12:01:39 -- common/autotest_common.sh@10 -- # set +x 00:07:26.844 ************************************ 00:07:26.844 END TEST dpdk_mem_utility 00:07:26.844 ************************************ 00:07:26.844 12:01:39 -- spdk/autotest.sh@187 -- # run_test event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:07:26.844 12:01:39 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:26.844 12:01:39 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:26.844 12:01:39 -- common/autotest_common.sh@10 -- # set +x 
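Nothing in the dpdk_mem_utility flow above is test-specific; the same three commands inspect any live SPDK application. env_dpdk_get_mem_stats makes the target write the dump file named in its reply, and dpdk_mem_info.py parses it:

    scripts/rpc.py env_dpdk_get_mem_stats    # target writes /tmp/spdk_mem_dump.txt
    scripts/dpdk_mem_info.py                 # heap / mempool / memzone summary
    scripts/dpdk_mem_info.py -m 0            # per-element breakdown of heap id 0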
00:07:26.844 ************************************ 00:07:26.844 START TEST event 00:07:26.844 ************************************ 00:07:26.844 12:01:39 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:07:27.104 * Looking for test storage... 00:07:27.104 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:07:27.104 12:01:39 -- event/event.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/bdev/nbd_common.sh 00:07:27.104 12:01:39 -- bdev/nbd_common.sh@6 -- # set -e 00:07:27.104 12:01:39 -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:07:27.104 12:01:39 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:07:27.104 12:01:39 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:27.104 12:01:39 -- common/autotest_common.sh@10 -- # set +x 00:07:27.104 ************************************ 00:07:27.104 START TEST event_perf 00:07:27.104 ************************************ 00:07:27.104 12:01:39 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:07:27.104 Running I/O for 1 seconds...[2024-06-11 12:01:39.951340] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:07:27.104 [2024-06-11 12:01:39.951445] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2673959 ] 00:07:27.104 EAL: No free 2048 kB hugepages reported on node 1 00:07:27.104 [2024-06-11 12:01:40.073663] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:27.104 [2024-06-11 12:01:40.126258] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:27.104 [2024-06-11 12:01:40.126342] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:27.104 [2024-06-11 12:01:40.126446] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:27.104 [2024-06-11 12:01:40.126448] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:28.488 Running I/O for 1 seconds... 00:07:28.488 lcore 0: 163376 00:07:28.488 lcore 1: 163372 00:07:28.488 lcore 2: 163374 00:07:28.488 lcore 3: 163376 00:07:28.488 done. 
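The four lcore counters above line up with the -m 0xF mask: these -m arguments are hex core bitmasks in which bit n selects core n, so 0x1 is core 0, 0x3 is cores 0-1, and 0xF is cores 0-3. The run just summarized was simply:

    test/event/event_perf/event_perf -m 0xF -t 1   # one event counter per core, 1 second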
00:07:28.488 00:07:28.488 real 0m1.265s 00:07:28.488 user 0m4.127s 00:07:28.488 sys 0m0.132s 00:07:28.488 12:01:41 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:28.488 12:01:41 -- common/autotest_common.sh@10 -- # set +x 00:07:28.488 ************************************ 00:07:28.488 END TEST event_perf 00:07:28.488 ************************************ 00:07:28.488 12:01:41 -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:07:28.488 12:01:41 -- common/autotest_common.sh@1077 -- # '[' 4 -le 1 ']' 00:07:28.488 12:01:41 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:28.488 12:01:41 -- common/autotest_common.sh@10 -- # set +x 00:07:28.488 ************************************ 00:07:28.488 START TEST event_reactor 00:07:28.488 ************************************ 00:07:28.488 12:01:41 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:07:28.488 [2024-06-11 12:01:41.255370] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:07:28.488 [2024-06-11 12:01:41.255484] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2674133 ] 00:07:28.488 EAL: No free 2048 kB hugepages reported on node 1 00:07:28.488 [2024-06-11 12:01:41.373885] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:28.488 [2024-06-11 12:01:41.421453] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:29.865 test_start 00:07:29.865 oneshot 00:07:29.865 tick 100 00:07:29.865 tick 100 00:07:29.865 tick 250 00:07:29.865 tick 100 00:07:29.865 tick 100 00:07:29.865 tick 100 00:07:29.865 tick 250 00:07:29.865 tick 500 00:07:29.865 tick 100 00:07:29.865 tick 100 00:07:29.865 tick 250 00:07:29.865 tick 100 00:07:29.865 tick 100 00:07:29.865 test_end 00:07:29.865 00:07:29.865 real 0m1.252s 00:07:29.865 user 0m1.116s 00:07:29.865 sys 0m0.130s 00:07:29.865 12:01:42 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:29.865 12:01:42 -- common/autotest_common.sh@10 -- # set +x 00:07:29.865 ************************************ 00:07:29.865 END TEST event_reactor 00:07:29.865 ************************************ 00:07:29.865 12:01:42 -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:07:29.865 12:01:42 -- common/autotest_common.sh@1077 -- # '[' 4 -le 1 ']' 00:07:29.865 12:01:42 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:29.865 12:01:42 -- common/autotest_common.sh@10 -- # set +x 00:07:29.865 ************************************ 00:07:29.865 START TEST event_reactor_perf 00:07:29.865 ************************************ 00:07:29.865 12:01:42 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:07:29.865 [2024-06-11 12:01:42.549184] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 
00:07:29.865 [2024-06-11 12:01:42.549271] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2674296 ] 00:07:29.865 EAL: No free 2048 kB hugepages reported on node 1 00:07:29.865 [2024-06-11 12:01:42.669721] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:29.865 [2024-06-11 12:01:42.717287] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:30.801 test_start 00:07:30.801 test_end 00:07:30.801 Performance: 551228 events per second 00:07:30.801 00:07:30.801 real 0m1.256s 00:07:30.801 user 0m1.126s 00:07:30.801 sys 0m0.123s 00:07:30.801 12:01:43 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:30.801 12:01:43 -- common/autotest_common.sh@10 -- # set +x 00:07:30.801 ************************************ 00:07:30.801 END TEST event_reactor_perf 00:07:30.801 ************************************ 00:07:30.801 12:01:43 -- event/event.sh@49 -- # uname -s 00:07:30.801 12:01:43 -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:07:30.801 12:01:43 -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:07:30.801 12:01:43 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:30.801 12:01:43 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:30.801 12:01:43 -- common/autotest_common.sh@10 -- # set +x 00:07:31.061 ************************************ 00:07:31.061 START TEST event_scheduler 00:07:31.061 ************************************ 00:07:31.061 12:01:43 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:07:31.061 * Looking for test storage... 00:07:31.061 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler 00:07:31.061 12:01:43 -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:07:31.061 12:01:43 -- scheduler/scheduler.sh@35 -- # scheduler_pid=2674532 00:07:31.061 12:01:43 -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:07:31.061 12:01:43 -- scheduler/scheduler.sh@37 -- # waitforlisten 2674532 00:07:31.061 12:01:43 -- common/autotest_common.sh@819 -- # '[' -z 2674532 ']' 00:07:31.061 12:01:43 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:31.061 12:01:43 -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:07:31.061 12:01:43 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:31.061 12:01:43 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:31.061 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:31.061 12:01:43 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:31.061 12:01:43 -- common/autotest_common.sh@10 -- # set +x 00:07:31.061 [2024-06-11 12:01:43.939216] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 
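The scheduler app about to initialize was started with --wait-for-rpc, so the framework idles until the test picks a scheduler and finishes init over RPC, which is what the framework_set_scheduler and framework_start_init calls below do. The same handshake against a plain target would look roughly like this (spdk_tgt stands in for the test binary; both RPCs appear in the rpc_get_methods list earlier in this log):

    build/bin/spdk_tgt --wait-for-rpc &
    # ... waitforlisten on /var/tmp/spdk.sock, as in the earlier tests ...
    scripts/rpc.py framework_set_scheduler dynamic   # must happen before init
    scripts/rpc.py framework_start_init              # subsystems come up now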
00:07:31.061 [2024-06-11 12:01:43.939310] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2674532 ] 00:07:31.061 EAL: No free 2048 kB hugepages reported on node 1 00:07:31.061 [2024-06-11 12:01:44.035482] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:31.061 [2024-06-11 12:01:44.083069] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:31.061 [2024-06-11 12:01:44.083173] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:31.061 [2024-06-11 12:01:44.083172] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:31.061 [2024-06-11 12:01:44.083151] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:31.320 12:01:44 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:31.320 12:01:44 -- common/autotest_common.sh@852 -- # return 0 00:07:31.320 12:01:44 -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:07:31.320 12:01:44 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:31.320 12:01:44 -- common/autotest_common.sh@10 -- # set +x 00:07:31.320 POWER: Env isn't set yet! 00:07:31.320 POWER: Attempting to initialise ACPI cpufreq power management... 00:07:31.320 POWER: Failed to write /sys/devices/system/cpu/cpu%u/cpufreq/scaling_governor 00:07:31.320 POWER: Cannot set governor of lcore 0 to userspace 00:07:31.320 POWER: Attempting to initialise PSTAT power management... 00:07:31.320 POWER: Power management governor of lcore 0 has been set to 'performance' successfully 00:07:31.320 POWER: Initialized successfully for lcore 0 power management 00:07:31.320 POWER: Power management governor of lcore 1 has been set to 'performance' successfully 00:07:31.320 POWER: Initialized successfully for lcore 1 power management 00:07:31.320 POWER: Power management governor of lcore 2 has been set to 'performance' successfully 00:07:31.320 POWER: Initialized successfully for lcore 2 power management 00:07:31.320 POWER: Power management governor of lcore 3 has been set to 'performance' successfully 00:07:31.320 POWER: Initialized successfully for lcore 3 power management 00:07:31.320 12:01:44 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:31.320 12:01:44 -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:07:31.320 12:01:44 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:31.320 12:01:44 -- common/autotest_common.sh@10 -- # set +x 00:07:31.320 [2024-06-11 12:01:44.277754] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 
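The POWER lines above are DPDK's ACPI/P-state power management switching each lcore's cpufreq governor to 'performance' for the test; the teardown further down restores the originals ('powersave' on this host). The state lives in the standard sysfs path the first error message cites:

    cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_governor          # one core
    grep . /sys/devices/system/cpu/cpu[0-3]/cpufreq/scaling_governor   # all four lcores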
00:07:31.321 12:01:44 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:31.321 12:01:44 -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:07:31.321 12:01:44 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:31.321 12:01:44 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:31.321 12:01:44 -- common/autotest_common.sh@10 -- # set +x 00:07:31.321 ************************************ 00:07:31.321 START TEST scheduler_create_thread 00:07:31.321 ************************************ 00:07:31.321 12:01:44 -- common/autotest_common.sh@1104 -- # scheduler_create_thread 00:07:31.321 12:01:44 -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:07:31.321 12:01:44 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:31.321 12:01:44 -- common/autotest_common.sh@10 -- # set +x 00:07:31.321 2 00:07:31.321 12:01:44 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:31.321 12:01:44 -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:07:31.321 12:01:44 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:31.321 12:01:44 -- common/autotest_common.sh@10 -- # set +x 00:07:31.321 3 00:07:31.321 12:01:44 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:31.321 12:01:44 -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:07:31.321 12:01:44 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:31.321 12:01:44 -- common/autotest_common.sh@10 -- # set +x 00:07:31.321 4 00:07:31.321 12:01:44 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:31.321 12:01:44 -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:07:31.321 12:01:44 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:31.321 12:01:44 -- common/autotest_common.sh@10 -- # set +x 00:07:31.321 5 00:07:31.321 12:01:44 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:31.321 12:01:44 -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:07:31.321 12:01:44 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:31.321 12:01:44 -- common/autotest_common.sh@10 -- # set +x 00:07:31.321 6 00:07:31.321 12:01:44 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:31.321 12:01:44 -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:07:31.321 12:01:44 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:31.321 12:01:44 -- common/autotest_common.sh@10 -- # set +x 00:07:31.321 7 00:07:31.321 12:01:44 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:31.321 12:01:44 -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:07:31.321 12:01:44 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:31.321 12:01:44 -- common/autotest_common.sh@10 -- # set +x 00:07:31.580 8 00:07:31.580 12:01:44 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:31.580 12:01:44 -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:07:31.580 12:01:44 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:31.580 12:01:44 -- common/autotest_common.sh@10 -- # set +x 00:07:31.580 9 00:07:31.580 
12:01:44 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:31.580 12:01:44 -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:07:31.580 12:01:44 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:31.580 12:01:44 -- common/autotest_common.sh@10 -- # set +x 00:07:31.580 10 00:07:31.580 12:01:44 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:31.580 12:01:44 -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:07:31.580 12:01:44 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:31.580 12:01:44 -- common/autotest_common.sh@10 -- # set +x 00:07:31.580 12:01:44 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:31.580 12:01:44 -- scheduler/scheduler.sh@22 -- # thread_id=11 00:07:31.580 12:01:44 -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:07:31.580 12:01:44 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:31.580 12:01:44 -- common/autotest_common.sh@10 -- # set +x 00:07:31.580 12:01:44 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:31.580 12:01:44 -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:07:31.580 12:01:44 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:31.580 12:01:44 -- common/autotest_common.sh@10 -- # set +x 00:07:32.959 12:01:45 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:32.959 12:01:45 -- scheduler/scheduler.sh@25 -- # thread_id=12 00:07:32.959 12:01:45 -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:07:32.959 12:01:45 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:32.959 12:01:45 -- common/autotest_common.sh@10 -- # set +x 00:07:33.895 12:01:46 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:33.895 00:07:33.895 real 0m2.616s 00:07:33.895 user 0m0.023s 00:07:33.895 sys 0m0.006s 00:07:33.895 12:01:46 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:33.895 12:01:46 -- common/autotest_common.sh@10 -- # set +x 00:07:33.895 ************************************ 00:07:33.895 END TEST scheduler_create_thread 00:07:33.895 ************************************ 00:07:34.154 12:01:46 -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:07:34.154 12:01:46 -- scheduler/scheduler.sh@46 -- # killprocess 2674532 00:07:34.154 12:01:46 -- common/autotest_common.sh@926 -- # '[' -z 2674532 ']' 00:07:34.154 12:01:46 -- common/autotest_common.sh@930 -- # kill -0 2674532 00:07:34.154 12:01:46 -- common/autotest_common.sh@931 -- # uname 00:07:34.154 12:01:46 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:07:34.154 12:01:46 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2674532 00:07:34.154 12:01:46 -- common/autotest_common.sh@932 -- # process_name=reactor_2 00:07:34.154 12:01:46 -- common/autotest_common.sh@936 -- # '[' reactor_2 = sudo ']' 00:07:34.154 12:01:46 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2674532' 00:07:34.154 killing process with pid 2674532 00:07:34.154 12:01:46 -- common/autotest_common.sh@945 -- # kill 2674532 00:07:34.154 12:01:46 -- common/autotest_common.sh@950 -- # wait 2674532 00:07:34.413 [2024-06-11 12:01:47.383816] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 
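scheduler_create_thread above drove the target through rpc.py's plugin mechanism: --plugin scheduler_plugin loads the test's client-side commands (scheduler_thread_create with -n name, -m cpumask, -a active percentage, plus scheduler_thread_set_active and scheduler_thread_delete). For instance, the thread created and then deleted at the end, assuming the plugin module is on PYTHONPATH as rpc_cmd arranges:

    scripts/rpc.py --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100
    scripts/rpc.py --plugin scheduler_plugin scheduler_thread_delete 12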
00:07:34.413 POWER: Power management governor of lcore 0 has been set to 'powersave' successfully 00:07:34.413 POWER: Power management of lcore 0 has exited from 'performance' mode and been set back to the original 00:07:34.413 POWER: Power management governor of lcore 1 has been set to 'powersave' successfully 00:07:34.413 POWER: Power management of lcore 1 has exited from 'performance' mode and been set back to the original 00:07:34.413 POWER: Power management governor of lcore 2 has been set to 'powersave' successfully 00:07:34.413 POWER: Power management of lcore 2 has exited from 'performance' mode and been set back to the original 00:07:34.413 POWER: Power management governor of lcore 3 has been set to 'powersave' successfully 00:07:34.413 POWER: Power management of lcore 3 has exited from 'performance' mode and been set back to the original 00:07:34.672 00:07:34.672 real 0m3.747s 00:07:34.672 user 0m5.790s 00:07:34.672 sys 0m0.384s 00:07:34.672 12:01:47 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:34.672 12:01:47 -- common/autotest_common.sh@10 -- # set +x 00:07:34.672 ************************************ 00:07:34.672 END TEST event_scheduler 00:07:34.672 ************************************ 00:07:34.672 12:01:47 -- event/event.sh@51 -- # modprobe -n nbd 00:07:34.672 12:01:47 -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:07:34.672 12:01:47 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:34.672 12:01:47 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:34.672 12:01:47 -- common/autotest_common.sh@10 -- # set +x 00:07:34.672 ************************************ 00:07:34.672 START TEST app_repeat 00:07:34.672 ************************************ 00:07:34.672 12:01:47 -- common/autotest_common.sh@1104 -- # app_repeat_test 00:07:34.672 12:01:47 -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:34.672 12:01:47 -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:34.672 12:01:47 -- event/event.sh@13 -- # local nbd_list 00:07:34.672 12:01:47 -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:34.672 12:01:47 -- event/event.sh@14 -- # local bdev_list 00:07:34.672 12:01:47 -- event/event.sh@15 -- # local repeat_times=4 00:07:34.672 12:01:47 -- event/event.sh@17 -- # modprobe nbd 00:07:34.672 12:01:47 -- event/event.sh@19 -- # repeat_pid=2675008 00:07:34.672 12:01:47 -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:07:34.672 12:01:47 -- event/event.sh@18 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:07:34.672 12:01:47 -- event/event.sh@21 -- # echo 'Process app_repeat pid: 2675008' 00:07:34.672 Process app_repeat pid: 2675008 00:07:34.672 12:01:47 -- event/event.sh@23 -- # for i in {0..2} 00:07:34.672 12:01:47 -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:07:34.672 spdk_app_start Round 0 00:07:34.672 12:01:47 -- event/event.sh@25 -- # waitforlisten 2675008 /var/tmp/spdk-nbd.sock 00:07:34.672 12:01:47 -- common/autotest_common.sh@819 -- # '[' -z 2675008 ']' 00:07:34.672 12:01:47 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:34.672 12:01:47 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:34.672 12:01:47 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 
00:07:34.672 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:07:34.672 12:01:47 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:34.672 12:01:47 -- common/autotest_common.sh@10 -- # set +x 00:07:34.672 [2024-06-11 12:01:47.669681] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:07:34.672 [2024-06-11 12:01:47.669782] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2675008 ] 00:07:34.931 EAL: No free 2048 kB hugepages reported on node 1 00:07:34.931 [2024-06-11 12:01:47.793654] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:34.931 [2024-06-11 12:01:47.844841] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:34.931 [2024-06-11 12:01:47.844845] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:35.867 12:01:48 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:35.867 12:01:48 -- common/autotest_common.sh@852 -- # return 0 00:07:35.867 12:01:48 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:35.867 Malloc0 00:07:35.867 12:01:48 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:36.126 Malloc1 00:07:36.126 12:01:48 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:36.126 12:01:48 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:36.126 12:01:48 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:36.126 12:01:48 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:36.126 12:01:48 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:36.126 12:01:48 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:36.126 12:01:48 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:36.126 12:01:48 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:36.126 12:01:48 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:36.126 12:01:48 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:36.126 12:01:48 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:36.126 12:01:48 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:36.126 12:01:48 -- bdev/nbd_common.sh@12 -- # local i 00:07:36.126 12:01:48 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:36.126 12:01:48 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:36.126 12:01:48 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:36.385 /dev/nbd0 00:07:36.385 12:01:49 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:36.385 12:01:49 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:36.385 12:01:49 -- common/autotest_common.sh@856 -- # local nbd_name=nbd0 00:07:36.385 12:01:49 -- common/autotest_common.sh@857 -- # local i 00:07:36.385 12:01:49 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:07:36.385 12:01:49 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:07:36.385 12:01:49 -- common/autotest_common.sh@860 -- # grep -q -w nbd0 /proc/partitions 00:07:36.385 12:01:49 -- 
common/autotest_common.sh@861 -- # break 00:07:36.385 12:01:49 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:07:36.385 12:01:49 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:07:36.385 12:01:49 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:36.385 1+0 records in 00:07:36.385 1+0 records out 00:07:36.385 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000278618 s, 14.7 MB/s 00:07:36.385 12:01:49 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:36.385 12:01:49 -- common/autotest_common.sh@874 -- # size=4096 00:07:36.385 12:01:49 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:36.385 12:01:49 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:07:36.385 12:01:49 -- common/autotest_common.sh@877 -- # return 0 00:07:36.385 12:01:49 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:36.385 12:01:49 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:36.385 12:01:49 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:36.643 /dev/nbd1 00:07:36.643 12:01:49 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:36.643 12:01:49 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:36.643 12:01:49 -- common/autotest_common.sh@856 -- # local nbd_name=nbd1 00:07:36.643 12:01:49 -- common/autotest_common.sh@857 -- # local i 00:07:36.643 12:01:49 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:07:36.644 12:01:49 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:07:36.644 12:01:49 -- common/autotest_common.sh@860 -- # grep -q -w nbd1 /proc/partitions 00:07:36.644 12:01:49 -- common/autotest_common.sh@861 -- # break 00:07:36.644 12:01:49 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:07:36.644 12:01:49 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:07:36.644 12:01:49 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:36.644 1+0 records in 00:07:36.644 1+0 records out 00:07:36.644 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000277264 s, 14.8 MB/s 00:07:36.644 12:01:49 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:36.644 12:01:49 -- common/autotest_common.sh@874 -- # size=4096 00:07:36.644 12:01:49 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:36.644 12:01:49 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:07:36.644 12:01:49 -- common/autotest_common.sh@877 -- # return 0 00:07:36.644 12:01:49 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:36.644 12:01:49 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:36.644 12:01:49 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:36.644 12:01:49 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:36.644 12:01:49 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:36.903 12:01:49 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:36.903 { 00:07:36.903 "nbd_device": "/dev/nbd0", 00:07:36.903 "bdev_name": "Malloc0" 00:07:36.903 }, 00:07:36.903 { 00:07:36.903 "nbd_device": 
"/dev/nbd1", 00:07:36.903 "bdev_name": "Malloc1" 00:07:36.903 } 00:07:36.903 ]' 00:07:36.903 12:01:49 -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:36.903 { 00:07:36.903 "nbd_device": "/dev/nbd0", 00:07:36.903 "bdev_name": "Malloc0" 00:07:36.903 }, 00:07:36.903 { 00:07:36.903 "nbd_device": "/dev/nbd1", 00:07:36.903 "bdev_name": "Malloc1" 00:07:36.903 } 00:07:36.903 ]' 00:07:36.903 12:01:49 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:36.903 12:01:49 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:36.903 /dev/nbd1' 00:07:36.903 12:01:49 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:36.903 /dev/nbd1' 00:07:36.903 12:01:49 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:36.903 12:01:49 -- bdev/nbd_common.sh@65 -- # count=2 00:07:36.903 12:01:49 -- bdev/nbd_common.sh@66 -- # echo 2 00:07:36.903 12:01:49 -- bdev/nbd_common.sh@95 -- # count=2 00:07:36.903 12:01:49 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:36.903 12:01:49 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:36.903 12:01:49 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:36.903 12:01:49 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:36.903 12:01:49 -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:36.903 12:01:49 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:36.903 12:01:49 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:36.903 12:01:49 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:36.903 256+0 records in 00:07:36.903 256+0 records out 00:07:36.903 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0106087 s, 98.8 MB/s 00:07:36.903 12:01:49 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:36.903 12:01:49 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:36.903 256+0 records in 00:07:36.903 256+0 records out 00:07:36.903 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0246072 s, 42.6 MB/s 00:07:36.903 12:01:49 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:36.903 12:01:49 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:36.903 256+0 records in 00:07:36.903 256+0 records out 00:07:36.903 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0268718 s, 39.0 MB/s 00:07:36.903 12:01:49 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:36.903 12:01:49 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:36.903 12:01:49 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:36.903 12:01:49 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:36.903 12:01:49 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:36.903 12:01:49 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:36.903 12:01:49 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:36.903 12:01:49 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:36.903 12:01:49 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:36.903 12:01:49 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:36.903 12:01:49 -- 
bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:36.903 12:01:49 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:36.903 12:01:49 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:36.903 12:01:49 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:36.903 12:01:49 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:36.903 12:01:49 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:36.903 12:01:49 -- bdev/nbd_common.sh@51 -- # local i 00:07:36.903 12:01:49 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:36.903 12:01:49 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:37.162 12:01:50 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:37.162 12:01:50 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:37.162 12:01:50 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:37.162 12:01:50 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:37.162 12:01:50 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:37.162 12:01:50 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:37.421 12:01:50 -- bdev/nbd_common.sh@41 -- # break 00:07:37.421 12:01:50 -- bdev/nbd_common.sh@45 -- # return 0 00:07:37.421 12:01:50 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:37.421 12:01:50 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:37.421 12:01:50 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:37.679 12:01:50 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:37.679 12:01:50 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:37.679 12:01:50 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:37.679 12:01:50 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:37.679 12:01:50 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:37.679 12:01:50 -- bdev/nbd_common.sh@41 -- # break 00:07:37.679 12:01:50 -- bdev/nbd_common.sh@45 -- # return 0 00:07:37.679 12:01:50 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:37.679 12:01:50 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:37.679 12:01:50 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:37.679 12:01:50 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:37.679 12:01:50 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:37.679 12:01:50 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:37.679 12:01:50 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:37.679 12:01:50 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:37.679 12:01:50 -- bdev/nbd_common.sh@65 -- # echo '' 00:07:37.679 12:01:50 -- bdev/nbd_common.sh@65 -- # true 00:07:37.679 12:01:50 -- bdev/nbd_common.sh@65 -- # count=0 00:07:37.679 12:01:50 -- bdev/nbd_common.sh@66 -- # echo 0 00:07:37.679 12:01:50 -- bdev/nbd_common.sh@104 -- # count=0 00:07:37.679 12:01:50 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:37.679 12:01:50 -- bdev/nbd_common.sh@109 -- # return 0 00:07:37.679 12:01:50 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 
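The data-integrity pass that just finished is a plain write-then-compare cycle; every command appears verbatim in the dd and cmp lines above. Condensed into one place (the standalone framing and relative paths are assumptions):

  # Sketch of the nbd_dd_data_verify flow traced above.
  tmp=./nbdrandtest
  nbd_list=(/dev/nbd0 /dev/nbd1)

  # Write phase: 1 MiB of random data, copied raw onto every nbd device.
  dd if=/dev/urandom of="$tmp" bs=4096 count=256
  for dev in "${nbd_list[@]}"; do
    dd if="$tmp" of="$dev" bs=4096 count=256 oflag=direct
  done

  # Verify phase: each device must read back byte-identical to the source.
  for dev in "${nbd_list[@]}"; do
    cmp -b -n 1M "$tmp" "$dev"   # any difference fails the round
  done
  rm "$tmp"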
00:07:37.938 12:01:50 -- event/event.sh@35 -- # sleep 3 00:07:38.197 [2024-06-11 12:01:51.167829] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:38.197 [2024-06-11 12:01:51.214250] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:38.197 [2024-06-11 12:01:51.214254] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:38.455 [2024-06-11 12:01:51.264775] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:38.455 [2024-06-11 12:01:51.264829] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:40.981 12:01:53 -- event/event.sh@23 -- # for i in {0..2} 00:07:40.981 12:01:53 -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:07:40.981 spdk_app_start Round 1 00:07:40.981 12:01:53 -- event/event.sh@25 -- # waitforlisten 2675008 /var/tmp/spdk-nbd.sock 00:07:40.981 12:01:53 -- common/autotest_common.sh@819 -- # '[' -z 2675008 ']' 00:07:40.981 12:01:53 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:40.981 12:01:53 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:40.981 12:01:53 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:40.981 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:07:40.981 12:01:53 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:40.981 12:01:53 -- common/autotest_common.sh@10 -- # set +x 00:07:41.239 12:01:54 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:41.239 12:01:54 -- common/autotest_common.sh@852 -- # return 0 00:07:41.239 12:01:54 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:41.497 Malloc0 00:07:41.497 12:01:54 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:41.755 Malloc1 00:07:41.755 12:01:54 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:41.755 12:01:54 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:41.755 12:01:54 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:41.755 12:01:54 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:41.755 12:01:54 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:41.755 12:01:54 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:41.755 12:01:54 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:41.755 12:01:54 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:41.755 12:01:54 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:41.755 12:01:54 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:41.755 12:01:54 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:41.755 12:01:54 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:41.755 12:01:54 -- bdev/nbd_common.sh@12 -- # local i 00:07:41.755 12:01:54 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:41.755 12:01:54 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:41.755 12:01:54 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:42.013 
/dev/nbd0 00:07:42.013 12:01:54 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:42.013 12:01:54 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:42.013 12:01:54 -- common/autotest_common.sh@856 -- # local nbd_name=nbd0 00:07:42.013 12:01:54 -- common/autotest_common.sh@857 -- # local i 00:07:42.013 12:01:54 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:07:42.013 12:01:54 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:07:42.013 12:01:54 -- common/autotest_common.sh@860 -- # grep -q -w nbd0 /proc/partitions 00:07:42.013 12:01:54 -- common/autotest_common.sh@861 -- # break 00:07:42.013 12:01:54 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:07:42.013 12:01:54 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:07:42.013 12:01:54 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:42.013 1+0 records in 00:07:42.013 1+0 records out 00:07:42.013 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000241576 s, 17.0 MB/s 00:07:42.013 12:01:54 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:42.013 12:01:54 -- common/autotest_common.sh@874 -- # size=4096 00:07:42.013 12:01:54 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:42.013 12:01:54 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:07:42.013 12:01:54 -- common/autotest_common.sh@877 -- # return 0 00:07:42.013 12:01:54 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:42.013 12:01:54 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:42.013 12:01:54 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:42.271 /dev/nbd1 00:07:42.271 12:01:55 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:42.271 12:01:55 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:42.271 12:01:55 -- common/autotest_common.sh@856 -- # local nbd_name=nbd1 00:07:42.271 12:01:55 -- common/autotest_common.sh@857 -- # local i 00:07:42.271 12:01:55 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:07:42.271 12:01:55 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:07:42.271 12:01:55 -- common/autotest_common.sh@860 -- # grep -q -w nbd1 /proc/partitions 00:07:42.271 12:01:55 -- common/autotest_common.sh@861 -- # break 00:07:42.271 12:01:55 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:07:42.271 12:01:55 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:07:42.271 12:01:55 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:42.271 1+0 records in 00:07:42.271 1+0 records out 00:07:42.271 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00031665 s, 12.9 MB/s 00:07:42.271 12:01:55 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:42.271 12:01:55 -- common/autotest_common.sh@874 -- # size=4096 00:07:42.271 12:01:55 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:42.271 12:01:55 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:07:42.271 12:01:55 -- common/autotest_common.sh@877 -- # return 0 00:07:42.271 12:01:55 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:42.271 12:01:55 -- bdev/nbd_common.sh@14 -- # (( i < 2 
)) 00:07:42.271 12:01:55 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:42.271 12:01:55 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:42.271 12:01:55 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:42.529 12:01:55 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:42.529 { 00:07:42.529 "nbd_device": "/dev/nbd0", 00:07:42.529 "bdev_name": "Malloc0" 00:07:42.529 }, 00:07:42.529 { 00:07:42.529 "nbd_device": "/dev/nbd1", 00:07:42.529 "bdev_name": "Malloc1" 00:07:42.529 } 00:07:42.529 ]' 00:07:42.529 12:01:55 -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:42.529 { 00:07:42.529 "nbd_device": "/dev/nbd0", 00:07:42.529 "bdev_name": "Malloc0" 00:07:42.529 }, 00:07:42.529 { 00:07:42.529 "nbd_device": "/dev/nbd1", 00:07:42.529 "bdev_name": "Malloc1" 00:07:42.529 } 00:07:42.529 ]' 00:07:42.529 12:01:55 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:42.529 12:01:55 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:42.529 /dev/nbd1' 00:07:42.529 12:01:55 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:42.529 /dev/nbd1' 00:07:42.529 12:01:55 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:42.529 12:01:55 -- bdev/nbd_common.sh@65 -- # count=2 00:07:42.529 12:01:55 -- bdev/nbd_common.sh@66 -- # echo 2 00:07:42.529 12:01:55 -- bdev/nbd_common.sh@95 -- # count=2 00:07:42.529 12:01:55 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:42.529 12:01:55 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:42.529 12:01:55 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:42.529 12:01:55 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:42.529 12:01:55 -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:42.529 12:01:55 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:42.529 12:01:55 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:42.529 12:01:55 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:42.529 256+0 records in 00:07:42.529 256+0 records out 00:07:42.529 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0104477 s, 100 MB/s 00:07:42.529 12:01:55 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:42.529 12:01:55 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:42.529 256+0 records in 00:07:42.529 256+0 records out 00:07:42.529 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0199072 s, 52.7 MB/s 00:07:42.529 12:01:55 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:42.529 12:01:55 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:42.529 256+0 records in 00:07:42.529 256+0 records out 00:07:42.529 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.032129 s, 32.6 MB/s 00:07:42.529 12:01:55 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:42.529 12:01:55 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:42.529 12:01:55 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:42.529 12:01:55 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:42.529 12:01:55 -- bdev/nbd_common.sh@72 -- # local 
tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:42.529 12:01:55 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:42.529 12:01:55 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:42.529 12:01:55 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:42.529 12:01:55 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:42.529 12:01:55 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:42.529 12:01:55 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:42.529 12:01:55 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:42.787 12:01:55 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:42.787 12:01:55 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:42.787 12:01:55 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:42.787 12:01:55 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:42.787 12:01:55 -- bdev/nbd_common.sh@51 -- # local i 00:07:42.787 12:01:55 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:42.787 12:01:55 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:42.787 12:01:55 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:42.787 12:01:55 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:42.787 12:01:55 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:42.787 12:01:55 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:42.787 12:01:55 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:42.787 12:01:55 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:42.787 12:01:55 -- bdev/nbd_common.sh@41 -- # break 00:07:42.787 12:01:55 -- bdev/nbd_common.sh@45 -- # return 0 00:07:42.787 12:01:55 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:42.787 12:01:55 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:43.045 12:01:56 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:43.045 12:01:56 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:43.045 12:01:56 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:43.045 12:01:56 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:43.045 12:01:56 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:43.045 12:01:56 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:43.045 12:01:56 -- bdev/nbd_common.sh@41 -- # break 00:07:43.045 12:01:56 -- bdev/nbd_common.sh@45 -- # return 0 00:07:43.045 12:01:56 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:43.045 12:01:56 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:43.045 12:01:56 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:43.303 12:01:56 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:43.303 12:01:56 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:43.303 12:01:56 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:43.303 12:01:56 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:43.303 12:01:56 -- bdev/nbd_common.sh@65 -- # echo '' 00:07:43.303 12:01:56 -- 
bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:43.303 12:01:56 -- bdev/nbd_common.sh@65 -- # true 00:07:43.303 12:01:56 -- bdev/nbd_common.sh@65 -- # count=0 00:07:43.303 12:01:56 -- bdev/nbd_common.sh@66 -- # echo 0 00:07:43.303 12:01:56 -- bdev/nbd_common.sh@104 -- # count=0 00:07:43.303 12:01:56 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:43.303 12:01:56 -- bdev/nbd_common.sh@109 -- # return 0 00:07:43.303 12:01:56 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:43.561 12:01:56 -- event/event.sh@35 -- # sleep 3 00:07:43.819 [2024-06-11 12:01:56.732693] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:43.819 [2024-06-11 12:01:56.776901] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:43.819 [2024-06-11 12:01:56.776905] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:43.819 [2024-06-11 12:01:56.821081] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:43.819 [2024-06-11 12:01:56.821132] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:47.102 12:01:59 -- event/event.sh@23 -- # for i in {0..2} 00:07:47.102 12:01:59 -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:07:47.102 spdk_app_start Round 2 00:07:47.102 12:01:59 -- event/event.sh@25 -- # waitforlisten 2675008 /var/tmp/spdk-nbd.sock 00:07:47.102 12:01:59 -- common/autotest_common.sh@819 -- # '[' -z 2675008 ']' 00:07:47.102 12:01:59 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:47.102 12:01:59 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:47.102 12:01:59 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:47.102 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
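Each nbd device is gated on a polling helper before any I/O is issued; the waitfornbd traces above reduce to the loop below. The retry bound of 20 and the /proc/partitions probe come straight from the trace; the sleep interval is an assumption, since the log does not show the delay between attempts:

  # Sketch of the waitfornbd polling pattern traced above.
  waitfornbd_sketch() {
    local nbd_name=$1 i size
    for ((i = 1; i <= 20; i++)); do
      grep -q -w "$nbd_name" /proc/partitions && break
      sleep 0.1                  # assumed back-off between probes
    done
    # The device must also service a direct read before it counts as ready.
    dd if="/dev/$nbd_name" of=./nbdtest bs=4096 count=1 iflag=direct
    size=$(stat -c %s ./nbdtest)
    rm -f ./nbdtest
    [ "$size" != 0 ]             # non-empty read-back => device is usable
  }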
00:07:47.102 12:01:59 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:47.102 12:01:59 -- common/autotest_common.sh@10 -- # set +x 00:07:47.102 12:01:59 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:47.102 12:01:59 -- common/autotest_common.sh@852 -- # return 0 00:07:47.102 12:01:59 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:47.102 Malloc0 00:07:47.102 12:01:59 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:47.102 Malloc1 00:07:47.102 12:02:00 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:47.102 12:02:00 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:47.102 12:02:00 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:47.102 12:02:00 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:47.102 12:02:00 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:47.102 12:02:00 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:47.102 12:02:00 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:47.102 12:02:00 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:47.102 12:02:00 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:47.102 12:02:00 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:47.102 12:02:00 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:47.102 12:02:00 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:47.102 12:02:00 -- bdev/nbd_common.sh@12 -- # local i 00:07:47.102 12:02:00 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:47.102 12:02:00 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:47.102 12:02:00 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:47.359 /dev/nbd0 00:07:47.359 12:02:00 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:47.359 12:02:00 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:47.359 12:02:00 -- common/autotest_common.sh@856 -- # local nbd_name=nbd0 00:07:47.359 12:02:00 -- common/autotest_common.sh@857 -- # local i 00:07:47.359 12:02:00 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:07:47.359 12:02:00 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:07:47.359 12:02:00 -- common/autotest_common.sh@860 -- # grep -q -w nbd0 /proc/partitions 00:07:47.359 12:02:00 -- common/autotest_common.sh@861 -- # break 00:07:47.359 12:02:00 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:07:47.359 12:02:00 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:07:47.359 12:02:00 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:47.359 1+0 records in 00:07:47.359 1+0 records out 00:07:47.359 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000257108 s, 15.9 MB/s 00:07:47.359 12:02:00 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:47.359 12:02:00 -- common/autotest_common.sh@874 -- # size=4096 00:07:47.359 12:02:00 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:47.359 12:02:00 -- 
common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:07:47.359 12:02:00 -- common/autotest_common.sh@877 -- # return 0 00:07:47.359 12:02:00 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:47.359 12:02:00 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:47.359 12:02:00 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:47.616 /dev/nbd1 00:07:47.616 12:02:00 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:47.616 12:02:00 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:47.616 12:02:00 -- common/autotest_common.sh@856 -- # local nbd_name=nbd1 00:07:47.616 12:02:00 -- common/autotest_common.sh@857 -- # local i 00:07:47.616 12:02:00 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:07:47.616 12:02:00 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:07:47.616 12:02:00 -- common/autotest_common.sh@860 -- # grep -q -w nbd1 /proc/partitions 00:07:47.616 12:02:00 -- common/autotest_common.sh@861 -- # break 00:07:47.616 12:02:00 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:07:47.616 12:02:00 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:07:47.616 12:02:00 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:47.616 1+0 records in 00:07:47.616 1+0 records out 00:07:47.616 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000241416 s, 17.0 MB/s 00:07:47.617 12:02:00 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:47.617 12:02:00 -- common/autotest_common.sh@874 -- # size=4096 00:07:47.617 12:02:00 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:47.617 12:02:00 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:07:47.617 12:02:00 -- common/autotest_common.sh@877 -- # return 0 00:07:47.617 12:02:00 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:47.617 12:02:00 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:47.617 12:02:00 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:47.617 12:02:00 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:47.617 12:02:00 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:47.874 12:02:00 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:47.874 { 00:07:47.874 "nbd_device": "/dev/nbd0", 00:07:47.874 "bdev_name": "Malloc0" 00:07:47.874 }, 00:07:47.874 { 00:07:47.874 "nbd_device": "/dev/nbd1", 00:07:47.874 "bdev_name": "Malloc1" 00:07:47.874 } 00:07:47.874 ]' 00:07:47.874 12:02:00 -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:47.874 { 00:07:47.874 "nbd_device": "/dev/nbd0", 00:07:47.874 "bdev_name": "Malloc0" 00:07:47.874 }, 00:07:47.874 { 00:07:47.874 "nbd_device": "/dev/nbd1", 00:07:47.874 "bdev_name": "Malloc1" 00:07:47.874 } 00:07:47.874 ]' 00:07:47.874 12:02:00 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:47.874 12:02:00 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:47.874 /dev/nbd1' 00:07:47.874 12:02:00 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:47.874 /dev/nbd1' 00:07:47.874 12:02:00 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:47.874 12:02:00 -- bdev/nbd_common.sh@65 -- # count=2 00:07:47.874 12:02:00 -- bdev/nbd_common.sh@66 -- # echo 2 00:07:47.874 12:02:00 -- 
bdev/nbd_common.sh@95 -- # count=2 00:07:47.874 12:02:00 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:47.874 12:02:00 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:47.874 12:02:00 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:47.874 12:02:00 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:47.874 12:02:00 -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:47.874 12:02:00 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:47.874 12:02:00 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:47.874 12:02:00 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:47.874 256+0 records in 00:07:47.874 256+0 records out 00:07:47.874 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0109567 s, 95.7 MB/s 00:07:47.874 12:02:00 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:47.874 12:02:00 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:47.874 256+0 records in 00:07:47.874 256+0 records out 00:07:47.874 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0304175 s, 34.5 MB/s 00:07:47.874 12:02:00 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:47.874 12:02:00 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:48.194 256+0 records in 00:07:48.194 256+0 records out 00:07:48.194 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0317124 s, 33.1 MB/s 00:07:48.194 12:02:00 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:48.194 12:02:00 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:48.194 12:02:00 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:48.194 12:02:00 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:48.194 12:02:00 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:48.194 12:02:00 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:48.194 12:02:00 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:48.194 12:02:00 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:48.194 12:02:00 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:48.194 12:02:00 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:48.194 12:02:00 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:48.194 12:02:00 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:48.194 12:02:00 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:48.194 12:02:00 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:48.194 12:02:00 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:48.194 12:02:00 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:48.194 12:02:00 -- bdev/nbd_common.sh@51 -- # local i 00:07:48.194 12:02:00 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:48.194 12:02:00 -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:48.194 12:02:01 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:48.194 12:02:01 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:48.194 12:02:01 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:48.194 12:02:01 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:48.194 12:02:01 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:48.194 12:02:01 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:48.194 12:02:01 -- bdev/nbd_common.sh@41 -- # break 00:07:48.194 12:02:01 -- bdev/nbd_common.sh@45 -- # return 0 00:07:48.194 12:02:01 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:48.194 12:02:01 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:48.479 12:02:01 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:48.479 12:02:01 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:48.479 12:02:01 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:48.479 12:02:01 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:48.479 12:02:01 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:48.479 12:02:01 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:48.479 12:02:01 -- bdev/nbd_common.sh@41 -- # break 00:07:48.479 12:02:01 -- bdev/nbd_common.sh@45 -- # return 0 00:07:48.479 12:02:01 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:48.479 12:02:01 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:48.479 12:02:01 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:48.750 12:02:01 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:48.750 12:02:01 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:48.750 12:02:01 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:48.750 12:02:01 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:48.750 12:02:01 -- bdev/nbd_common.sh@65 -- # echo '' 00:07:48.750 12:02:01 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:48.750 12:02:01 -- bdev/nbd_common.sh@65 -- # true 00:07:48.750 12:02:01 -- bdev/nbd_common.sh@65 -- # count=0 00:07:48.750 12:02:01 -- bdev/nbd_common.sh@66 -- # echo 0 00:07:48.750 12:02:01 -- bdev/nbd_common.sh@104 -- # count=0 00:07:48.750 12:02:01 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:48.750 12:02:01 -- bdev/nbd_common.sh@109 -- # return 0 00:07:48.750 12:02:01 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:49.008 12:02:01 -- event/event.sh@35 -- # sleep 3 00:07:49.008 [2024-06-11 12:02:02.041017] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:49.266 [2024-06-11 12:02:02.087481] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:49.266 [2024-06-11 12:02:02.087485] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:49.266 [2024-06-11 12:02:02.138354] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:49.266 [2024-06-11 12:02:02.138414] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 
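Round 2 above is the last full pass; structurally, app_repeat is nothing more than this create/attach/verify/kill cycle run three times against the same pid. A sketch of the loop, using the rpc.py entry points visible in the trace (the restart plumbing between rounds happens inside the app and is only hinted at here):

  # Structural sketch of the app_repeat rounds traced above.
  rpc="./scripts/rpc.py -s /var/tmp/spdk-nbd.sock"

  for i in {0..2}; do
    echo "spdk_app_start Round $i"
    $rpc bdev_malloc_create 64 4096          # Malloc0
    $rpc bdev_malloc_create 64 4096          # Malloc1
    $rpc nbd_start_disk Malloc0 /dev/nbd0
    $rpc nbd_start_disk Malloc1 /dev/nbd1
    # ... write/verify via dd and cmp, as sketched after Round 0 ...
    $rpc nbd_stop_disk /dev/nbd0
    $rpc nbd_stop_disk /dev/nbd1
    $rpc spdk_kill_instance SIGTERM          # app restarts for the next round
    sleep 3
  done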
00:07:52.546 12:02:04 -- event/event.sh@38 -- # waitforlisten 2675008 /var/tmp/spdk-nbd.sock 00:07:52.546 12:02:04 -- common/autotest_common.sh@819 -- # '[' -z 2675008 ']' 00:07:52.546 12:02:04 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:52.546 12:02:04 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:52.546 12:02:04 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:52.546 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:07:52.546 12:02:04 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:52.546 12:02:04 -- common/autotest_common.sh@10 -- # set +x 00:07:52.546 12:02:05 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:52.546 12:02:05 -- common/autotest_common.sh@852 -- # return 0 00:07:52.546 12:02:05 -- event/event.sh@39 -- # killprocess 2675008 00:07:52.546 12:02:05 -- common/autotest_common.sh@926 -- # '[' -z 2675008 ']' 00:07:52.546 12:02:05 -- common/autotest_common.sh@930 -- # kill -0 2675008 00:07:52.546 12:02:05 -- common/autotest_common.sh@931 -- # uname 00:07:52.546 12:02:05 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:07:52.546 12:02:05 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2675008 00:07:52.546 12:02:05 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:07:52.546 12:02:05 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:07:52.546 12:02:05 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2675008' 00:07:52.546 killing process with pid 2675008 00:07:52.546 12:02:05 -- common/autotest_common.sh@945 -- # kill 2675008 00:07:52.546 12:02:05 -- common/autotest_common.sh@950 -- # wait 2675008 00:07:52.546 spdk_app_start is called in Round 0. 00:07:52.546 Shutdown signal received, stop current app iteration 00:07:52.546 Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 reinitialization... 00:07:52.546 spdk_app_start is called in Round 1. 00:07:52.546 Shutdown signal received, stop current app iteration 00:07:52.546 Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 reinitialization... 00:07:52.546 spdk_app_start is called in Round 2. 00:07:52.546 Shutdown signal received, stop current app iteration 00:07:52.546 Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 reinitialization... 00:07:52.546 spdk_app_start is called in Round 3. 
00:07:52.546 Shutdown signal received, stop current app iteration 00:07:52.546 12:02:05 -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:07:52.546 12:02:05 -- event/event.sh@42 -- # return 0 00:07:52.546 00:07:52.546 real 0m17.592s 00:07:52.546 user 0m37.613s 00:07:52.546 sys 0m3.775s 00:07:52.546 12:02:05 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:52.546 12:02:05 -- common/autotest_common.sh@10 -- # set +x 00:07:52.546 ************************************ 00:07:52.546 END TEST app_repeat 00:07:52.546 ************************************ 00:07:52.546 12:02:05 -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:07:52.546 12:02:05 -- event/event.sh@55 -- # run_test cpu_locks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:07:52.546 12:02:05 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:52.546 12:02:05 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:52.546 12:02:05 -- common/autotest_common.sh@10 -- # set +x 00:07:52.546 ************************************ 00:07:52.546 START TEST cpu_locks 00:07:52.546 ************************************ 00:07:52.546 12:02:05 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:07:52.546 * Looking for test storage... 00:07:52.546 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:07:52.546 12:02:05 -- event/cpu_locks.sh@11 -- # rpc_sock1=/var/tmp/spdk.sock 00:07:52.546 12:02:05 -- event/cpu_locks.sh@12 -- # rpc_sock2=/var/tmp/spdk2.sock 00:07:52.546 12:02:05 -- event/cpu_locks.sh@164 -- # trap cleanup EXIT SIGTERM SIGINT 00:07:52.546 12:02:05 -- event/cpu_locks.sh@166 -- # run_test default_locks default_locks 00:07:52.546 12:02:05 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:52.546 12:02:05 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:52.546 12:02:05 -- common/autotest_common.sh@10 -- # set +x 00:07:52.546 ************************************ 00:07:52.546 START TEST default_locks 00:07:52.546 ************************************ 00:07:52.546 12:02:05 -- common/autotest_common.sh@1104 -- # default_locks 00:07:52.546 12:02:05 -- event/cpu_locks.sh@46 -- # spdk_tgt_pid=2677618 00:07:52.546 12:02:05 -- event/cpu_locks.sh@47 -- # waitforlisten 2677618 00:07:52.546 12:02:05 -- event/cpu_locks.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:52.546 12:02:05 -- common/autotest_common.sh@819 -- # '[' -z 2677618 ']' 00:07:52.546 12:02:05 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:52.546 12:02:05 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:52.546 12:02:05 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:52.546 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:52.546 12:02:05 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:52.546 12:02:05 -- common/autotest_common.sh@10 -- # set +x 00:07:52.546 [2024-06-11 12:02:05.421142] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 
00:07:52.546 [2024-06-11 12:02:05.421237] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2677618 ] 00:07:52.546 EAL: No free 2048 kB hugepages reported on node 1 00:07:52.546 [2024-06-11 12:02:05.541554] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:52.805 [2024-06-11 12:02:05.587604] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:52.805 [2024-06-11 12:02:05.587751] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:53.372 12:02:06 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:53.373 12:02:06 -- common/autotest_common.sh@852 -- # return 0 00:07:53.373 12:02:06 -- event/cpu_locks.sh@49 -- # locks_exist 2677618 00:07:53.373 12:02:06 -- event/cpu_locks.sh@22 -- # lslocks -p 2677618 00:07:53.373 12:02:06 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:54.308 lslocks: write error 00:07:54.308 12:02:07 -- event/cpu_locks.sh@50 -- # killprocess 2677618 00:07:54.308 12:02:07 -- common/autotest_common.sh@926 -- # '[' -z 2677618 ']' 00:07:54.308 12:02:07 -- common/autotest_common.sh@930 -- # kill -0 2677618 00:07:54.308 12:02:07 -- common/autotest_common.sh@931 -- # uname 00:07:54.308 12:02:07 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:07:54.308 12:02:07 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2677618 00:07:54.308 12:02:07 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:07:54.308 12:02:07 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:07:54.308 12:02:07 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2677618' 00:07:54.308 killing process with pid 2677618 00:07:54.308 12:02:07 -- common/autotest_common.sh@945 -- # kill 2677618 00:07:54.308 12:02:07 -- common/autotest_common.sh@950 -- # wait 2677618 00:07:54.568 12:02:07 -- event/cpu_locks.sh@52 -- # NOT waitforlisten 2677618 00:07:54.568 12:02:07 -- common/autotest_common.sh@640 -- # local es=0 00:07:54.568 12:02:07 -- common/autotest_common.sh@642 -- # valid_exec_arg waitforlisten 2677618 00:07:54.568 12:02:07 -- common/autotest_common.sh@628 -- # local arg=waitforlisten 00:07:54.568 12:02:07 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:07:54.568 12:02:07 -- common/autotest_common.sh@632 -- # type -t waitforlisten 00:07:54.568 12:02:07 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:07:54.568 12:02:07 -- common/autotest_common.sh@643 -- # waitforlisten 2677618 00:07:54.568 12:02:07 -- common/autotest_common.sh@819 -- # '[' -z 2677618 ']' 00:07:54.568 12:02:07 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:54.568 12:02:07 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:54.568 12:02:07 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:54.568 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
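The killprocess calls in this section (pid 2675008 earlier, pid 2677618 next) never signal blindly: the helper first checks that the pid is non-empty and still alive, then inspects the process name so it cannot take down a sudo wrapper. Reconstructed from the autotest_common.sh steps in the trace; treat it as a sketch of the guard sequence, not the verbatim helper:

  # Sketch of the killprocess guards traced above.
  killprocess_sketch() {
    local pid=$1 process_name
    [ -z "$pid" ] && return 1          # no pid, nothing to do
    kill -0 "$pid" || return 1         # process must still be running
    if [ "$(uname)" = Linux ]; then
      process_name=$(ps --no-headers -o comm= "$pid")
    fi
    [ "$process_name" = sudo ] && return 1   # never kill the sudo wrapper itself
    echo "killing process with pid $pid"
    kill "$pid"
    wait "$pid"                        # reap the child so the exit is observed
  }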
00:07:54.568 12:02:07 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:54.568 12:02:07 -- common/autotest_common.sh@10 -- # set +x 00:07:54.568 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 834: kill: (2677618) - No such process 00:07:54.568 ERROR: process (pid: 2677618) is no longer running 00:07:54.568 12:02:07 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:54.568 12:02:07 -- common/autotest_common.sh@852 -- # return 1 00:07:54.568 12:02:07 -- common/autotest_common.sh@643 -- # es=1 00:07:54.568 12:02:07 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:07:54.568 12:02:07 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:07:54.568 12:02:07 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:07:54.568 12:02:07 -- event/cpu_locks.sh@54 -- # no_locks 00:07:54.568 12:02:07 -- event/cpu_locks.sh@26 -- # lock_files=() 00:07:54.568 12:02:07 -- event/cpu_locks.sh@26 -- # local lock_files 00:07:54.568 12:02:07 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:07:54.568 00:07:54.568 real 0m2.017s 00:07:54.568 user 0m2.101s 00:07:54.568 sys 0m0.781s 00:07:54.568 12:02:07 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:54.568 12:02:07 -- common/autotest_common.sh@10 -- # set +x 00:07:54.568 ************************************ 00:07:54.568 END TEST default_locks 00:07:54.568 ************************************ 00:07:54.568 12:02:07 -- event/cpu_locks.sh@167 -- # run_test default_locks_via_rpc default_locks_via_rpc 00:07:54.568 12:02:07 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:54.568 12:02:07 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:54.568 12:02:07 -- common/autotest_common.sh@10 -- # set +x 00:07:54.568 ************************************ 00:07:54.568 START TEST default_locks_via_rpc 00:07:54.568 ************************************ 00:07:54.568 12:02:07 -- common/autotest_common.sh@1104 -- # default_locks_via_rpc 00:07:54.568 12:02:07 -- event/cpu_locks.sh@62 -- # spdk_tgt_pid=2677960 00:07:54.568 12:02:07 -- event/cpu_locks.sh@63 -- # waitforlisten 2677960 00:07:54.568 12:02:07 -- event/cpu_locks.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:54.568 12:02:07 -- common/autotest_common.sh@819 -- # '[' -z 2677960 ']' 00:07:54.568 12:02:07 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:54.568 12:02:07 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:54.568 12:02:07 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:54.568 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:54.568 12:02:07 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:54.568 12:02:07 -- common/autotest_common.sh@10 -- # set +x 00:07:54.568 [2024-06-11 12:02:07.491492] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 
00:07:54.568 [2024-06-11 12:02:07.491597] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2677960 ] 00:07:54.568 EAL: No free 2048 kB hugepages reported on node 1 00:07:54.827 [2024-06-11 12:02:07.615400] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:54.827 [2024-06-11 12:02:07.664230] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:54.827 [2024-06-11 12:02:07.664394] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:55.396 12:02:08 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:55.396 12:02:08 -- common/autotest_common.sh@852 -- # return 0 00:07:55.396 12:02:08 -- event/cpu_locks.sh@65 -- # rpc_cmd framework_disable_cpumask_locks 00:07:55.396 12:02:08 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:55.396 12:02:08 -- common/autotest_common.sh@10 -- # set +x 00:07:55.396 12:02:08 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:55.396 12:02:08 -- event/cpu_locks.sh@67 -- # no_locks 00:07:55.396 12:02:08 -- event/cpu_locks.sh@26 -- # lock_files=() 00:07:55.396 12:02:08 -- event/cpu_locks.sh@26 -- # local lock_files 00:07:55.396 12:02:08 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:07:55.396 12:02:08 -- event/cpu_locks.sh@69 -- # rpc_cmd framework_enable_cpumask_locks 00:07:55.396 12:02:08 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:55.396 12:02:08 -- common/autotest_common.sh@10 -- # set +x 00:07:55.396 12:02:08 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:55.396 12:02:08 -- event/cpu_locks.sh@71 -- # locks_exist 2677960 00:07:55.396 12:02:08 -- event/cpu_locks.sh@22 -- # lslocks -p 2677960 00:07:55.396 12:02:08 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:55.963 12:02:08 -- event/cpu_locks.sh@73 -- # killprocess 2677960 00:07:55.963 12:02:08 -- common/autotest_common.sh@926 -- # '[' -z 2677960 ']' 00:07:55.963 12:02:08 -- common/autotest_common.sh@930 -- # kill -0 2677960 00:07:55.963 12:02:08 -- common/autotest_common.sh@931 -- # uname 00:07:55.963 12:02:08 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:07:55.963 12:02:08 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2677960 00:07:55.963 12:02:08 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:07:55.963 12:02:08 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:07:55.963 12:02:08 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2677960' 00:07:55.963 killing process with pid 2677960 00:07:55.963 12:02:08 -- common/autotest_common.sh@945 -- # kill 2677960 00:07:55.963 12:02:08 -- common/autotest_common.sh@950 -- # wait 2677960 00:07:56.531 00:07:56.531 real 0m1.797s 00:07:56.531 user 0m1.851s 00:07:56.531 sys 0m0.695s 00:07:56.531 12:02:09 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:56.531 12:02:09 -- common/autotest_common.sh@10 -- # set +x 00:07:56.531 ************************************ 00:07:56.531 END TEST default_locks_via_rpc 00:07:56.531 ************************************ 00:07:56.531 12:02:09 -- event/cpu_locks.sh@168 -- # run_test non_locking_app_on_locked_coremask non_locking_app_on_locked_coremask 00:07:56.531 12:02:09 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:56.531 12:02:09 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:56.531 12:02:09 -- 
common/autotest_common.sh@10 -- # set +x 00:07:56.531 ************************************ 00:07:56.531 START TEST non_locking_app_on_locked_coremask 00:07:56.531 ************************************ 00:07:56.531 12:02:09 -- common/autotest_common.sh@1104 -- # non_locking_app_on_locked_coremask 00:07:56.531 12:02:09 -- event/cpu_locks.sh@80 -- # spdk_tgt_pid=2678196 00:07:56.531 12:02:09 -- event/cpu_locks.sh@81 -- # waitforlisten 2678196 /var/tmp/spdk.sock 00:07:56.531 12:02:09 -- event/cpu_locks.sh@79 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:56.531 12:02:09 -- common/autotest_common.sh@819 -- # '[' -z 2678196 ']' 00:07:56.531 12:02:09 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:56.531 12:02:09 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:56.531 12:02:09 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:56.531 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:56.531 12:02:09 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:56.531 12:02:09 -- common/autotest_common.sh@10 -- # set +x 00:07:56.531 [2024-06-11 12:02:09.337987] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:07:56.531 [2024-06-11 12:02:09.338063] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2678196 ] 00:07:56.531 EAL: No free 2048 kB hugepages reported on node 1 00:07:56.531 [2024-06-11 12:02:09.457301] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:56.531 [2024-06-11 12:02:09.501987] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:56.531 [2024-06-11 12:02:09.502135] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:57.469 12:02:10 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:57.469 12:02:10 -- common/autotest_common.sh@852 -- # return 0 00:07:57.469 12:02:10 -- event/cpu_locks.sh@84 -- # spdk_tgt_pid2=2678355 00:07:57.469 12:02:10 -- event/cpu_locks.sh@85 -- # waitforlisten 2678355 /var/tmp/spdk2.sock 00:07:57.469 12:02:10 -- common/autotest_common.sh@819 -- # '[' -z 2678355 ']' 00:07:57.469 12:02:10 -- event/cpu_locks.sh@83 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock 00:07:57.469 12:02:10 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:57.469 12:02:10 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:57.469 12:02:10 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:57.469 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:57.469 12:02:10 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:57.469 12:02:10 -- common/autotest_common.sh@10 -- # set +x 00:07:57.469 [2024-06-11 12:02:10.329076] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 
00:07:57.469 [2024-06-11 12:02:10.329167] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2678355 ] 00:07:57.469 EAL: No free 2048 kB hugepages reported on node 1 00:07:57.469 [2024-06-11 12:02:10.485303] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:07:57.469 [2024-06-11 12:02:10.485333] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:57.727 [2024-06-11 12:02:10.574072] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:57.727 [2024-06-11 12:02:10.574217] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:58.294 12:02:11 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:58.294 12:02:11 -- common/autotest_common.sh@852 -- # return 0 00:07:58.294 12:02:11 -- event/cpu_locks.sh@87 -- # locks_exist 2678196 00:07:58.294 12:02:11 -- event/cpu_locks.sh@22 -- # lslocks -p 2678196 00:07:58.294 12:02:11 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:08:00.198 lslocks: write error 00:08:00.198 12:02:12 -- event/cpu_locks.sh@89 -- # killprocess 2678196 00:08:00.198 12:02:12 -- common/autotest_common.sh@926 -- # '[' -z 2678196 ']' 00:08:00.198 12:02:12 -- common/autotest_common.sh@930 -- # kill -0 2678196 00:08:00.198 12:02:12 -- common/autotest_common.sh@931 -- # uname 00:08:00.198 12:02:12 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:08:00.198 12:02:12 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2678196 00:08:00.198 12:02:12 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:08:00.198 12:02:12 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:08:00.198 12:02:12 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2678196' 00:08:00.198 killing process with pid 2678196 00:08:00.198 12:02:12 -- common/autotest_common.sh@945 -- # kill 2678196 00:08:00.198 12:02:12 -- common/autotest_common.sh@950 -- # wait 2678196 00:08:00.457 12:02:13 -- event/cpu_locks.sh@90 -- # killprocess 2678355 00:08:00.458 12:02:13 -- common/autotest_common.sh@926 -- # '[' -z 2678355 ']' 00:08:00.458 12:02:13 -- common/autotest_common.sh@930 -- # kill -0 2678355 00:08:00.458 12:02:13 -- common/autotest_common.sh@931 -- # uname 00:08:00.458 12:02:13 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:08:00.458 12:02:13 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2678355 00:08:00.458 12:02:13 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:08:00.458 12:02:13 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:08:00.458 12:02:13 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2678355' 00:08:00.458 killing process with pid 2678355 00:08:00.458 12:02:13 -- common/autotest_common.sh@945 -- # kill 2678355 00:08:00.458 12:02:13 -- common/autotest_common.sh@950 -- # wait 2678355 00:08:01.026 00:08:01.026 real 0m4.517s 00:08:01.026 user 0m4.960s 00:08:01.026 sys 0m1.507s 00:08:01.026 12:02:13 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:01.026 12:02:13 -- common/autotest_common.sh@10 -- # set +x 00:08:01.026 ************************************ 00:08:01.026 END TEST non_locking_app_on_locked_coremask 00:08:01.026 ************************************ 00:08:01.026 12:02:13 -- event/cpu_locks.sh@169 -- # run_test locking_app_on_unlocked_coremask 
locking_app_on_unlocked_coremask 00:08:01.026 12:02:13 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:08:01.026 12:02:13 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:01.026 12:02:13 -- common/autotest_common.sh@10 -- # set +x 00:08:01.026 ************************************ 00:08:01.026 START TEST locking_app_on_unlocked_coremask 00:08:01.026 ************************************ 00:08:01.026 12:02:13 -- common/autotest_common.sh@1104 -- # locking_app_on_unlocked_coremask 00:08:01.026 12:02:13 -- event/cpu_locks.sh@98 -- # spdk_tgt_pid=2678917 00:08:01.026 12:02:13 -- event/cpu_locks.sh@99 -- # waitforlisten 2678917 /var/tmp/spdk.sock 00:08:01.026 12:02:13 -- event/cpu_locks.sh@97 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks 00:08:01.026 12:02:13 -- common/autotest_common.sh@819 -- # '[' -z 2678917 ']' 00:08:01.026 12:02:13 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:01.026 12:02:13 -- common/autotest_common.sh@824 -- # local max_retries=100 00:08:01.026 12:02:13 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:01.026 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:01.026 12:02:13 -- common/autotest_common.sh@828 -- # xtrace_disable 00:08:01.026 12:02:13 -- common/autotest_common.sh@10 -- # set +x 00:08:01.026 [2024-06-11 12:02:13.907945] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:08:01.026 [2024-06-11 12:02:13.908029] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2678917 ] 00:08:01.026 EAL: No free 2048 kB hugepages reported on node 1 00:08:01.026 [2024-06-11 12:02:14.029793] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:08:01.026 [2024-06-11 12:02:14.029833] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:01.285 [2024-06-11 12:02:14.079487] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:01.285 [2024-06-11 12:02:14.079642] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:01.853 12:02:14 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:08:01.853 12:02:14 -- common/autotest_common.sh@852 -- # return 0 00:08:01.853 12:02:14 -- event/cpu_locks.sh@102 -- # spdk_tgt_pid2=2679038 00:08:01.853 12:02:14 -- event/cpu_locks.sh@103 -- # waitforlisten 2679038 /var/tmp/spdk2.sock 00:08:01.853 12:02:14 -- event/cpu_locks.sh@101 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:08:01.853 12:02:14 -- common/autotest_common.sh@819 -- # '[' -z 2679038 ']' 00:08:01.853 12:02:14 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:08:01.853 12:02:14 -- common/autotest_common.sh@824 -- # local max_retries=100 00:08:01.853 12:02:14 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:08:01.853 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
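The two launches above are the whole point of locking_app_on_unlocked_coremask: the first target opts out of core locking, so a second target may claim the same mask and take the lock itself; without --disable-cpumask-locks the second launch would instead abort with the claim error seen in the locked_coremask test further down. Reduced to its commands (binary path as used throughout this workspace):

  ./build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks &   # shares core 0, takes no lock
  ./build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock &    # same core, acquires the core-0 lock file (/var/tmp/spdk_cpu_lock_000)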
00:08:01.853 12:02:14 -- common/autotest_common.sh@828 -- # xtrace_disable 00:08:01.853 12:02:14 -- common/autotest_common.sh@10 -- # set +x 00:08:02.112 [2024-06-11 12:02:14.886805] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:08:02.112 [2024-06-11 12:02:14.886915] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2679038 ] 00:08:02.112 EAL: No free 2048 kB hugepages reported on node 1 00:08:02.112 [2024-06-11 12:02:15.046158] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:02.112 [2024-06-11 12:02:15.138395] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:02.112 [2024-06-11 12:02:15.138550] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:03.050 12:02:15 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:08:03.050 12:02:15 -- common/autotest_common.sh@852 -- # return 0 00:08:03.050 12:02:15 -- event/cpu_locks.sh@105 -- # locks_exist 2679038 00:08:03.050 12:02:15 -- event/cpu_locks.sh@22 -- # lslocks -p 2679038 00:08:03.050 12:02:15 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:08:04.428 lslocks: write error 00:08:04.428 12:02:17 -- event/cpu_locks.sh@107 -- # killprocess 2678917 00:08:04.428 12:02:17 -- common/autotest_common.sh@926 -- # '[' -z 2678917 ']' 00:08:04.428 12:02:17 -- common/autotest_common.sh@930 -- # kill -0 2678917 00:08:04.428 12:02:17 -- common/autotest_common.sh@931 -- # uname 00:08:04.428 12:02:17 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:08:04.428 12:02:17 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2678917 00:08:04.428 12:02:17 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:08:04.428 12:02:17 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:08:04.428 12:02:17 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2678917' 00:08:04.428 killing process with pid 2678917 00:08:04.428 12:02:17 -- common/autotest_common.sh@945 -- # kill 2678917 00:08:04.428 12:02:17 -- common/autotest_common.sh@950 -- # wait 2678917 00:08:04.997 12:02:17 -- event/cpu_locks.sh@108 -- # killprocess 2679038 00:08:04.997 12:02:17 -- common/autotest_common.sh@926 -- # '[' -z 2679038 ']' 00:08:04.997 12:02:17 -- common/autotest_common.sh@930 -- # kill -0 2679038 00:08:04.997 12:02:17 -- common/autotest_common.sh@931 -- # uname 00:08:04.997 12:02:17 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:08:04.997 12:02:17 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2679038 00:08:04.997 12:02:17 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:08:04.997 12:02:17 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:08:04.997 12:02:17 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2679038' 00:08:04.997 killing process with pid 2679038 00:08:04.997 12:02:17 -- common/autotest_common.sh@945 -- # kill 2679038 00:08:04.997 12:02:17 -- common/autotest_common.sh@950 -- # wait 2679038 00:08:05.256 00:08:05.256 real 0m4.377s 00:08:05.256 user 0m4.768s 00:08:05.256 sys 0m1.491s 00:08:05.256 12:02:18 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:05.256 12:02:18 -- common/autotest_common.sh@10 -- # set +x 00:08:05.256 ************************************ 00:08:05.256 END TEST locking_app_on_unlocked_coremask 
00:08:05.256 ************************************ 00:08:05.515 12:02:18 -- event/cpu_locks.sh@170 -- # run_test locking_app_on_locked_coremask locking_app_on_locked_coremask 00:08:05.515 12:02:18 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:08:05.515 12:02:18 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:05.515 12:02:18 -- common/autotest_common.sh@10 -- # set +x 00:08:05.515 ************************************ 00:08:05.515 START TEST locking_app_on_locked_coremask 00:08:05.515 ************************************ 00:08:05.515 12:02:18 -- common/autotest_common.sh@1104 -- # locking_app_on_locked_coremask 00:08:05.515 12:02:18 -- event/cpu_locks.sh@115 -- # spdk_tgt_pid=2679503 00:08:05.515 12:02:18 -- event/cpu_locks.sh@116 -- # waitforlisten 2679503 /var/tmp/spdk.sock 00:08:05.515 12:02:18 -- event/cpu_locks.sh@114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:08:05.515 12:02:18 -- common/autotest_common.sh@819 -- # '[' -z 2679503 ']' 00:08:05.515 12:02:18 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:05.515 12:02:18 -- common/autotest_common.sh@824 -- # local max_retries=100 00:08:05.515 12:02:18 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:05.515 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:05.515 12:02:18 -- common/autotest_common.sh@828 -- # xtrace_disable 00:08:05.515 12:02:18 -- common/autotest_common.sh@10 -- # set +x 00:08:05.515 [2024-06-11 12:02:18.331531] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:08:05.515 [2024-06-11 12:02:18.331598] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2679503 ] 00:08:05.515 EAL: No free 2048 kB hugepages reported on node 1 00:08:05.515 [2024-06-11 12:02:18.448200] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:05.515 [2024-06-11 12:02:18.492787] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:05.515 [2024-06-11 12:02:18.492937] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:06.453 12:02:19 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:08:06.453 12:02:19 -- common/autotest_common.sh@852 -- # return 0 00:08:06.453 12:02:19 -- event/cpu_locks.sh@118 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:08:06.453 12:02:19 -- event/cpu_locks.sh@119 -- # spdk_tgt_pid2=2679680 00:08:06.453 12:02:19 -- event/cpu_locks.sh@120 -- # NOT waitforlisten 2679680 /var/tmp/spdk2.sock 00:08:06.453 12:02:19 -- common/autotest_common.sh@640 -- # local es=0 00:08:06.453 12:02:19 -- common/autotest_common.sh@642 -- # valid_exec_arg waitforlisten 2679680 /var/tmp/spdk2.sock 00:08:06.453 12:02:19 -- common/autotest_common.sh@628 -- # local arg=waitforlisten 00:08:06.453 12:02:19 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:08:06.453 12:02:19 -- common/autotest_common.sh@632 -- # type -t waitforlisten 00:08:06.453 12:02:19 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:08:06.453 12:02:19 -- common/autotest_common.sh@643 -- # waitforlisten 2679680 /var/tmp/spdk2.sock 00:08:06.453 12:02:19 -- 
common/autotest_common.sh@819 -- # '[' -z 2679680 ']' 00:08:06.453 12:02:19 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:08:06.453 12:02:19 -- common/autotest_common.sh@824 -- # local max_retries=100 00:08:06.453 12:02:19 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:08:06.453 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:08:06.453 12:02:19 -- common/autotest_common.sh@828 -- # xtrace_disable 00:08:06.453 12:02:19 -- common/autotest_common.sh@10 -- # set +x 00:08:06.453 [2024-06-11 12:02:19.296862] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:08:06.453 [2024-06-11 12:02:19.296940] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2679680 ] 00:08:06.453 EAL: No free 2048 kB hugepages reported on node 1 00:08:06.453 [2024-06-11 12:02:19.454781] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 0, probably process 2679503 has claimed it. 00:08:06.453 [2024-06-11 12:02:19.454827] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:08:07.021 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 834: kill: (2679680) - No such process 00:08:07.021 ERROR: process (pid: 2679680) is no longer running 00:08:07.021 12:02:20 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:08:07.021 12:02:20 -- common/autotest_common.sh@852 -- # return 1 00:08:07.021 12:02:20 -- common/autotest_common.sh@643 -- # es=1 00:08:07.021 12:02:20 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:08:07.021 12:02:20 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:08:07.021 12:02:20 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:08:07.021 12:02:20 -- event/cpu_locks.sh@122 -- # locks_exist 2679503 00:08:07.021 12:02:20 -- event/cpu_locks.sh@22 -- # lslocks -p 2679503 00:08:07.021 12:02:20 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:08:07.588 lslocks: write error 00:08:07.588 12:02:20 -- event/cpu_locks.sh@124 -- # killprocess 2679503 00:08:07.588 12:02:20 -- common/autotest_common.sh@926 -- # '[' -z 2679503 ']' 00:08:07.588 12:02:20 -- common/autotest_common.sh@930 -- # kill -0 2679503 00:08:07.588 12:02:20 -- common/autotest_common.sh@931 -- # uname 00:08:07.588 12:02:20 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:08:07.588 12:02:20 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2679503 00:08:07.588 12:02:20 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:08:07.588 12:02:20 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:08:07.588 12:02:20 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2679503' 00:08:07.588 killing process with pid 2679503 00:08:07.588 12:02:20 -- common/autotest_common.sh@945 -- # kill 2679503 00:08:07.588 12:02:20 -- common/autotest_common.sh@950 -- # wait 2679503 00:08:07.847 00:08:07.847 real 0m2.509s 00:08:07.847 user 0m2.776s 00:08:07.847 sys 0m0.796s 00:08:07.847 12:02:20 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:07.847 12:02:20 -- common/autotest_common.sh@10 -- # set +x 00:08:07.847 ************************************ 00:08:07.847 END TEST locking_app_on_locked_coremask 00:08:07.847 
************************************ 00:08:07.847 12:02:20 -- event/cpu_locks.sh@171 -- # run_test locking_overlapped_coremask locking_overlapped_coremask 00:08:07.847 12:02:20 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:08:07.847 12:02:20 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:07.847 12:02:20 -- common/autotest_common.sh@10 -- # set +x 00:08:07.847 ************************************ 00:08:07.847 START TEST locking_overlapped_coremask 00:08:07.847 ************************************ 00:08:07.847 12:02:20 -- common/autotest_common.sh@1104 -- # locking_overlapped_coremask 00:08:07.847 12:02:20 -- event/cpu_locks.sh@132 -- # spdk_tgt_pid=2679891 00:08:07.847 12:02:20 -- event/cpu_locks.sh@133 -- # waitforlisten 2679891 /var/tmp/spdk.sock 00:08:07.847 12:02:20 -- event/cpu_locks.sh@131 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 00:08:07.847 12:02:20 -- common/autotest_common.sh@819 -- # '[' -z 2679891 ']' 00:08:07.847 12:02:20 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:07.847 12:02:20 -- common/autotest_common.sh@824 -- # local max_retries=100 00:08:07.847 12:02:20 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:07.847 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:07.847 12:02:20 -- common/autotest_common.sh@828 -- # xtrace_disable 00:08:07.847 12:02:20 -- common/autotest_common.sh@10 -- # set +x 00:08:08.106 [2024-06-11 12:02:20.885698] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:08:08.106 [2024-06-11 12:02:20.885775] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2679891 ] 00:08:08.106 EAL: No free 2048 kB hugepages reported on node 1 00:08:08.106 [2024-06-11 12:02:20.996160] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:08:08.106 [2024-06-11 12:02:21.043491] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:08.106 [2024-06-11 12:02:21.043670] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:08:08.106 [2024-06-11 12:02:21.046377] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:08:08.106 [2024-06-11 12:02:21.046382] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:09.041 12:02:21 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:08:09.041 12:02:21 -- common/autotest_common.sh@852 -- # return 0 00:08:09.041 12:02:21 -- event/cpu_locks.sh@136 -- # spdk_tgt_pid2=2680071 00:08:09.041 12:02:21 -- event/cpu_locks.sh@137 -- # NOT waitforlisten 2680071 /var/tmp/spdk2.sock 00:08:09.041 12:02:21 -- event/cpu_locks.sh@135 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock 00:08:09.041 12:02:21 -- common/autotest_common.sh@640 -- # local es=0 00:08:09.041 12:02:21 -- common/autotest_common.sh@642 -- # valid_exec_arg waitforlisten 2680071 /var/tmp/spdk2.sock 00:08:09.041 12:02:21 -- common/autotest_common.sh@628 -- # local arg=waitforlisten 00:08:09.041 12:02:21 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:08:09.041 12:02:21 -- common/autotest_common.sh@632 -- # type -t waitforlisten 00:08:09.041 12:02:21 -- common/autotest_common.sh@632 
-- # case "$(type -t "$arg")" in 00:08:09.041 12:02:21 -- common/autotest_common.sh@643 -- # waitforlisten 2680071 /var/tmp/spdk2.sock 00:08:09.041 12:02:21 -- common/autotest_common.sh@819 -- # '[' -z 2680071 ']' 00:08:09.041 12:02:21 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:08:09.041 12:02:21 -- common/autotest_common.sh@824 -- # local max_retries=100 00:08:09.041 12:02:21 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:08:09.041 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:08:09.041 12:02:21 -- common/autotest_common.sh@828 -- # xtrace_disable 00:08:09.041 12:02:21 -- common/autotest_common.sh@10 -- # set +x 00:08:09.041 [2024-06-11 12:02:21.831089] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:08:09.041 [2024-06-11 12:02:21.831162] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2680071 ] 00:08:09.041 EAL: No free 2048 kB hugepages reported on node 1 00:08:09.041 [2024-06-11 12:02:21.943687] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 2679891 has claimed it. 00:08:09.041 [2024-06-11 12:02:21.943724] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:08:09.607 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 834: kill: (2680071) - No such process 00:08:09.607 ERROR: process (pid: 2680071) is no longer running 00:08:09.607 12:02:22 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:08:09.607 12:02:22 -- common/autotest_common.sh@852 -- # return 1 00:08:09.607 12:02:22 -- common/autotest_common.sh@643 -- # es=1 00:08:09.607 12:02:22 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:08:09.607 12:02:22 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:08:09.607 12:02:22 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:08:09.607 12:02:22 -- event/cpu_locks.sh@139 -- # check_remaining_locks 00:08:09.607 12:02:22 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:08:09.607 12:02:22 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:08:09.607 12:02:22 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:08:09.607 12:02:22 -- event/cpu_locks.sh@141 -- # killprocess 2679891 00:08:09.607 12:02:22 -- common/autotest_common.sh@926 -- # '[' -z 2679891 ']' 00:08:09.607 12:02:22 -- common/autotest_common.sh@930 -- # kill -0 2679891 00:08:09.607 12:02:22 -- common/autotest_common.sh@931 -- # uname 00:08:09.607 12:02:22 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:08:09.607 12:02:22 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2679891 00:08:09.607 12:02:22 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:08:09.607 12:02:22 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:08:09.607 12:02:22 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2679891' 00:08:09.607 killing process with pid 2679891 00:08:09.607 12:02:22 -- 
common/autotest_common.sh@945 -- # kill 2679891 00:08:09.607 12:02:22 -- common/autotest_common.sh@950 -- # wait 2679891 00:08:10.174 00:08:10.174 real 0m2.093s 00:08:10.174 user 0m5.939s 00:08:10.174 sys 0m0.556s 00:08:10.174 12:02:22 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:10.174 12:02:22 -- common/autotest_common.sh@10 -- # set +x 00:08:10.174 ************************************ 00:08:10.174 END TEST locking_overlapped_coremask 00:08:10.174 ************************************ 00:08:10.174 12:02:22 -- event/cpu_locks.sh@172 -- # run_test locking_overlapped_coremask_via_rpc locking_overlapped_coremask_via_rpc 00:08:10.174 12:02:23 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:08:10.174 12:02:23 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:10.174 12:02:23 -- common/autotest_common.sh@10 -- # set +x 00:08:10.174 ************************************ 00:08:10.174 START TEST locking_overlapped_coremask_via_rpc 00:08:10.174 ************************************ 00:08:10.174 12:02:23 -- common/autotest_common.sh@1104 -- # locking_overlapped_coremask_via_rpc 00:08:10.174 12:02:23 -- event/cpu_locks.sh@148 -- # spdk_tgt_pid=2680233 00:08:10.174 12:02:23 -- event/cpu_locks.sh@149 -- # waitforlisten 2680233 /var/tmp/spdk.sock 00:08:10.174 12:02:23 -- event/cpu_locks.sh@147 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks 00:08:10.174 12:02:23 -- common/autotest_common.sh@819 -- # '[' -z 2680233 ']' 00:08:10.174 12:02:23 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:10.174 12:02:23 -- common/autotest_common.sh@824 -- # local max_retries=100 00:08:10.174 12:02:23 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:10.174 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:10.174 12:02:23 -- common/autotest_common.sh@828 -- # xtrace_disable 00:08:10.174 12:02:23 -- common/autotest_common.sh@10 -- # set +x 00:08:10.174 [2024-06-11 12:02:23.034969] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:08:10.174 [2024-06-11 12:02:23.035071] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2680233 ] 00:08:10.174 EAL: No free 2048 kB hugepages reported on node 1 00:08:10.174 [2024-06-11 12:02:23.158405] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
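Both overlapped tests pair the masks 0x7 (cores 0-2) and 0x1c (cores 2-4), so core 2 is the one contested core, which is exactly the core named by the claim errors in this run. The overlap is plain bit arithmetic:

  printf 'contested mask: 0x%x\n' $(( 0x7 & 0x1c ))   # -> 0x4, i.e. only core 2 is shared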
00:08:10.174 [2024-06-11 12:02:23.158439] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:08:10.174 [2024-06-11 12:02:23.204411] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:10.174 [2024-06-11 12:02:23.204608] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:08:10.174 [2024-06-11 12:02:23.204694] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:08:10.174 [2024-06-11 12:02:23.204698] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:11.109 12:02:23 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:08:11.109 12:02:23 -- common/autotest_common.sh@852 -- # return 0 00:08:11.109 12:02:23 -- event/cpu_locks.sh@152 -- # spdk_tgt_pid2=2680296 00:08:11.109 12:02:23 -- event/cpu_locks.sh@153 -- # waitforlisten 2680296 /var/tmp/spdk2.sock 00:08:11.109 12:02:23 -- event/cpu_locks.sh@151 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks 00:08:11.109 12:02:23 -- common/autotest_common.sh@819 -- # '[' -z 2680296 ']' 00:08:11.109 12:02:23 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:08:11.109 12:02:23 -- common/autotest_common.sh@824 -- # local max_retries=100 00:08:11.109 12:02:23 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:08:11.109 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:08:11.109 12:02:23 -- common/autotest_common.sh@828 -- # xtrace_disable 00:08:11.109 12:02:23 -- common/autotest_common.sh@10 -- # set +x 00:08:11.109 [2024-06-11 12:02:24.021217] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:08:11.109 [2024-06-11 12:02:24.021294] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2680296 ] 00:08:11.109 EAL: No free 2048 kB hugepages reported on node 1 00:08:11.367 [2024-06-11 12:02:24.153205] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:08:11.367 [2024-06-11 12:02:24.153237] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:08:11.367 [2024-06-11 12:02:24.236197] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:11.367 [2024-06-11 12:02:24.236395] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:08:11.367 [2024-06-11 12:02:24.236454] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:08:11.367 [2024-06-11 12:02:24.236456] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 4 00:08:12.303 12:02:24 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:08:12.303 12:02:24 -- common/autotest_common.sh@852 -- # return 0 00:08:12.303 12:02:24 -- event/cpu_locks.sh@155 -- # rpc_cmd framework_enable_cpumask_locks 00:08:12.303 12:02:24 -- common/autotest_common.sh@551 -- # xtrace_disable 00:08:12.303 12:02:24 -- common/autotest_common.sh@10 -- # set +x 00:08:12.303 12:02:24 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:08:12.303 12:02:24 -- event/cpu_locks.sh@156 -- # NOT rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:08:12.303 12:02:24 -- common/autotest_common.sh@640 -- # local es=0 00:08:12.303 12:02:24 -- common/autotest_common.sh@642 -- # valid_exec_arg rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:08:12.303 12:02:24 -- common/autotest_common.sh@628 -- # local arg=rpc_cmd 00:08:12.303 12:02:24 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:08:12.303 12:02:24 -- common/autotest_common.sh@632 -- # type -t rpc_cmd 00:08:12.303 12:02:24 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:08:12.303 12:02:24 -- common/autotest_common.sh@643 -- # rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:08:12.303 12:02:24 -- common/autotest_common.sh@551 -- # xtrace_disable 00:08:12.303 12:02:24 -- common/autotest_common.sh@10 -- # set +x 00:08:12.303 [2024-06-11 12:02:24.994431] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 2680233 has claimed it. 00:08:12.303 request: 00:08:12.303 { 00:08:12.303 "method": "framework_enable_cpumask_locks", 00:08:12.303 "req_id": 1 00:08:12.303 } 00:08:12.303 Got JSON-RPC error response 00:08:12.303 response: 00:08:12.303 { 00:08:12.303 "code": -32603, 00:08:12.303 "message": "Failed to claim CPU core: 2" 00:08:12.303 } 00:08:12.303 12:02:25 -- common/autotest_common.sh@579 -- # [[ 1 == 0 ]] 00:08:12.303 12:02:25 -- common/autotest_common.sh@643 -- # es=1 00:08:12.303 12:02:25 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:08:12.303 12:02:25 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:08:12.303 12:02:25 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:08:12.303 12:02:25 -- event/cpu_locks.sh@158 -- # waitforlisten 2680233 /var/tmp/spdk.sock 00:08:12.303 12:02:25 -- common/autotest_common.sh@819 -- # '[' -z 2680233 ']' 00:08:12.303 12:02:25 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:12.303 12:02:25 -- common/autotest_common.sh@824 -- # local max_retries=100 00:08:12.303 12:02:25 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:12.303 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
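The refused claim above surfaces as a normal JSON-RPC error (code -32603, "Failed to claim CPU core: 2") rather than a process exit, which is what makes the via_rpc variant scriptable. Reproducing it by hand against the second target's socket (rpc.py path assumed; socket path as in this run):

  ./scripts/rpc.py -s /var/tmp/spdk2.sock framework_enable_cpumask_locks
  # fails with "Failed to claim CPU core: 2" while the first target still holds core 2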
00:08:12.303 12:02:25 -- common/autotest_common.sh@828 -- # xtrace_disable 00:08:12.303 12:02:25 -- common/autotest_common.sh@10 -- # set +x 00:08:12.303 12:02:25 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:08:12.303 12:02:25 -- common/autotest_common.sh@852 -- # return 0 00:08:12.303 12:02:25 -- event/cpu_locks.sh@159 -- # waitforlisten 2680296 /var/tmp/spdk2.sock 00:08:12.303 12:02:25 -- common/autotest_common.sh@819 -- # '[' -z 2680296 ']' 00:08:12.303 12:02:25 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:08:12.303 12:02:25 -- common/autotest_common.sh@824 -- # local max_retries=100 00:08:12.303 12:02:25 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:08:12.303 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:08:12.303 12:02:25 -- common/autotest_common.sh@828 -- # xtrace_disable 00:08:12.303 12:02:25 -- common/autotest_common.sh@10 -- # set +x 00:08:12.562 12:02:25 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:08:12.562 12:02:25 -- common/autotest_common.sh@852 -- # return 0 00:08:12.562 12:02:25 -- event/cpu_locks.sh@161 -- # check_remaining_locks 00:08:12.562 12:02:25 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:08:12.562 12:02:25 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:08:12.562 12:02:25 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:08:12.562 00:08:12.562 real 0m2.494s 00:08:12.562 user 0m1.169s 00:08:12.562 sys 0m0.251s 00:08:12.562 12:02:25 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:12.562 12:02:25 -- common/autotest_common.sh@10 -- # set +x 00:08:12.562 ************************************ 00:08:12.562 END TEST locking_overlapped_coremask_via_rpc 00:08:12.562 ************************************ 00:08:12.562 12:02:25 -- event/cpu_locks.sh@174 -- # cleanup 00:08:12.562 12:02:25 -- event/cpu_locks.sh@15 -- # [[ -z 2680233 ]] 00:08:12.562 12:02:25 -- event/cpu_locks.sh@15 -- # killprocess 2680233 00:08:12.562 12:02:25 -- common/autotest_common.sh@926 -- # '[' -z 2680233 ']' 00:08:12.562 12:02:25 -- common/autotest_common.sh@930 -- # kill -0 2680233 00:08:12.562 12:02:25 -- common/autotest_common.sh@931 -- # uname 00:08:12.562 12:02:25 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:08:12.562 12:02:25 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2680233 00:08:12.821 12:02:25 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:08:12.821 12:02:25 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:08:12.821 12:02:25 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2680233' 00:08:12.821 killing process with pid 2680233 00:08:12.821 12:02:25 -- common/autotest_common.sh@945 -- # kill 2680233 00:08:12.821 12:02:25 -- common/autotest_common.sh@950 -- # wait 2680233 00:08:13.080 12:02:25 -- event/cpu_locks.sh@16 -- # [[ -z 2680296 ]] 00:08:13.080 12:02:25 -- event/cpu_locks.sh@16 -- # killprocess 2680296 00:08:13.080 12:02:25 -- common/autotest_common.sh@926 -- # '[' -z 2680296 ']' 00:08:13.080 12:02:25 -- common/autotest_common.sh@930 -- # kill -0 2680296 00:08:13.080 12:02:25 -- common/autotest_common.sh@931 -- # uname 
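check_remaining_locks above leans on the one-file-per-core naming scheme: a target that claimed cores 0-2 must own exactly spdk_cpu_lock_000 through spdk_cpu_lock_002, no more and no fewer. The same comparison in isolation:

  locks=(/var/tmp/spdk_cpu_lock_*)                     # lock files that actually exist
  locks_expected=(/var/tmp/spdk_cpu_lock_{000..002})   # what a 0x7-mask target should own
  [[ ${locks[*]} == "${locks_expected[*]}" ]] && echo 'lock files match the claimed cores'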
00:08:13.080 12:02:25 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:08:13.080 12:02:25 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2680296 00:08:13.080 12:02:26 -- common/autotest_common.sh@932 -- # process_name=reactor_2 00:08:13.080 12:02:26 -- common/autotest_common.sh@936 -- # '[' reactor_2 = sudo ']' 00:08:13.080 12:02:26 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2680296' 00:08:13.080 killing process with pid 2680296 00:08:13.080 12:02:26 -- common/autotest_common.sh@945 -- # kill 2680296 00:08:13.080 12:02:26 -- common/autotest_common.sh@950 -- # wait 2680296 00:08:13.339 12:02:26 -- event/cpu_locks.sh@18 -- # rm -f 00:08:13.339 12:02:26 -- event/cpu_locks.sh@1 -- # cleanup 00:08:13.339 12:02:26 -- event/cpu_locks.sh@15 -- # [[ -z 2680233 ]] 00:08:13.339 12:02:26 -- event/cpu_locks.sh@15 -- # killprocess 2680233 00:08:13.339 12:02:26 -- common/autotest_common.sh@926 -- # '[' -z 2680233 ']' 00:08:13.339 12:02:26 -- common/autotest_common.sh@930 -- # kill -0 2680233 00:08:13.339 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 930: kill: (2680233) - No such process 00:08:13.339 12:02:26 -- common/autotest_common.sh@953 -- # echo 'Process with pid 2680233 is not found' 00:08:13.339 Process with pid 2680233 is not found 00:08:13.339 12:02:26 -- event/cpu_locks.sh@16 -- # [[ -z 2680296 ]] 00:08:13.339 12:02:26 -- event/cpu_locks.sh@16 -- # killprocess 2680296 00:08:13.339 12:02:26 -- common/autotest_common.sh@926 -- # '[' -z 2680296 ']' 00:08:13.339 12:02:26 -- common/autotest_common.sh@930 -- # kill -0 2680296 00:08:13.339 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 930: kill: (2680296) - No such process 00:08:13.339 12:02:26 -- common/autotest_common.sh@953 -- # echo 'Process with pid 2680296 is not found' 00:08:13.339 Process with pid 2680296 is not found 00:08:13.339 12:02:26 -- event/cpu_locks.sh@18 -- # rm -f 00:08:13.339 00:08:13.339 real 0m21.042s 00:08:13.339 user 0m35.941s 00:08:13.339 sys 0m7.101s 00:08:13.339 12:02:26 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:13.339 12:02:26 -- common/autotest_common.sh@10 -- # set +x 00:08:13.339 ************************************ 00:08:13.339 END TEST cpu_locks 00:08:13.339 ************************************ 00:08:13.339 00:08:13.339 real 0m46.542s 00:08:13.339 user 1m25.853s 00:08:13.339 sys 0m11.954s 00:08:13.339 12:02:26 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:13.339 12:02:26 -- common/autotest_common.sh@10 -- # set +x 00:08:13.339 ************************************ 00:08:13.339 END TEST event 00:08:13.339 ************************************ 00:08:13.598 12:02:26 -- spdk/autotest.sh@188 -- # run_test thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:08:13.598 12:02:26 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:08:13.598 12:02:26 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:13.598 12:02:26 -- common/autotest_common.sh@10 -- # set +x 00:08:13.598 ************************************ 00:08:13.598 START TEST thread 00:08:13.598 ************************************ 00:08:13.598 12:02:26 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:08:13.598 * Looking for test storage... 
00:08:13.598 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread 00:08:13.599 12:02:26 -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:08:13.599 12:02:26 -- common/autotest_common.sh@1077 -- # '[' 8 -le 1 ']' 00:08:13.599 12:02:26 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:13.599 12:02:26 -- common/autotest_common.sh@10 -- # set +x 00:08:13.599 ************************************ 00:08:13.599 START TEST thread_poller_perf 00:08:13.599 ************************************ 00:08:13.599 12:02:26 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:08:13.599 [2024-06-11 12:02:26.542984] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:08:13.599 [2024-06-11 12:02:26.543075] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2680748 ] 00:08:13.599 EAL: No free 2048 kB hugepages reported on node 1 00:08:13.857 [2024-06-11 12:02:26.662497] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:13.857 [2024-06-11 12:02:26.706510] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:13.857 Running 1000 pollers for 1 seconds with 1 microseconds period. 00:08:14.796 ====================================== 00:08:14.796 busy:2305491676 (cyc) 00:08:14.796 total_run_count: 504000 00:08:14.796 tsc_hz: 2300000000 (cyc) 00:08:14.796 ====================================== 00:08:14.796 poller_cost: 4574 (cyc), 1988 (nsec) 00:08:14.796 00:08:14.796 real 0m1.255s 00:08:14.796 user 0m1.119s 00:08:14.796 sys 0m0.129s 00:08:14.796 12:02:27 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:14.796 12:02:27 -- common/autotest_common.sh@10 -- # set +x 00:08:14.796 ************************************ 00:08:14.796 END TEST thread_poller_perf 00:08:14.796 ************************************ 00:08:14.796 12:02:27 -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:08:14.796 12:02:27 -- common/autotest_common.sh@1077 -- # '[' 8 -le 1 ']' 00:08:14.796 12:02:27 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:14.796 12:02:27 -- common/autotest_common.sh@10 -- # set +x 00:08:14.796 ************************************ 00:08:14.796 START TEST thread_poller_perf 00:08:14.796 ************************************ 00:08:14.796 12:02:27 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:08:15.055 [2024-06-11 12:02:27.832325] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 
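The poller summary above is self-consistent: poller_cost is the busy cycle count divided by the run count, converted with the reported 2.3 GHz TSC. Checking by hand:

  echo $(( 2305491676 / 504000 ))   # -> 4574 cycles per poll, as reported
  echo $(( 4574 * 10 / 23 ))        # -> 1988 ns (4574 cyc at 2.3 cycles per ns)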
00:08:15.055 [2024-06-11 12:02:27.832383] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2680944 ] 00:08:15.055 EAL: No free 2048 kB hugepages reported on node 1 00:08:15.055 [2024-06-11 12:02:27.932955] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:15.055 [2024-06-11 12:02:27.980508] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:15.055 Running 1000 pollers for 1 seconds with 0 microseconds period. 00:08:16.053 ====================================== 00:08:16.053 busy:2302667788 (cyc) 00:08:16.053 total_run_count: 8835000 00:08:16.053 tsc_hz: 2300000000 (cyc) 00:08:16.053 ====================================== 00:08:16.053 poller_cost: 260 (cyc), 113 (nsec) 00:08:16.053 00:08:16.053 real 0m1.228s 00:08:16.053 user 0m1.112s 00:08:16.053 sys 0m0.110s 00:08:16.053 12:02:29 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:16.053 12:02:29 -- common/autotest_common.sh@10 -- # set +x 00:08:16.053 ************************************ 00:08:16.053 END TEST thread_poller_perf 00:08:16.053 ************************************ 00:08:16.313 12:02:29 -- thread/thread.sh@17 -- # [[ n != \y ]] 00:08:16.313 12:02:29 -- thread/thread.sh@18 -- # run_test thread_spdk_lock /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:08:16.313 12:02:29 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:08:16.313 12:02:29 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:16.313 12:02:29 -- common/autotest_common.sh@10 -- # set +x 00:08:16.313 ************************************ 00:08:16.313 START TEST thread_spdk_lock 00:08:16.313 ************************************ 00:08:16.313 12:02:29 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:08:16.313 [2024-06-11 12:02:29.118018] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 
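The second run drops the poller period from 1 us (-l 1) to 0 (-l 0), so each poller runs on every reactor iteration rather than as a timed poller, and the per-poll cost falls from 4574 to 260 cycles. Same check on the second summary:

  echo $(( 2302667788 / 8835000 ))   # -> 260 cycles per poll
  echo $(( 260 * 10 / 23 ))          # -> 113 ns at the same 2.3 GHz TSC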
00:08:16.313 [2024-06-11 12:02:29.118124] [ DPDK EAL parameters: spdk_lock_test --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2681141 ] 00:08:16.313 EAL: No free 2048 kB hugepages reported on node 1 00:08:16.313 [2024-06-11 12:02:29.228875] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:08:16.313 [2024-06-11 12:02:29.279637] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:08:16.313 [2024-06-11 12:02:29.279641] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:16.881 [2024-06-11 12:02:29.784101] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 955:thread_execute_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:08:16.881 [2024-06-11 12:02:29.784149] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3062:spdk_spin_lock: *ERROR*: unrecoverable spinlock error 2: Deadlock detected (thread != sspin->thread) 00:08:16.881 [2024-06-11 12:02:29.784165] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3017:sspin_stacks_print: *ERROR*: spinlock 0x133de80 00:08:16.881 [2024-06-11 12:02:29.785124] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 850:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:08:16.881 [2024-06-11 12:02:29.785227] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1016:thread_execute_timed_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:08:16.881 [2024-06-11 12:02:29.785252] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 850:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:08:16.881 Starting test contend 00:08:16.881 Worker Delay Wait us Hold us Total us 00:08:16.881 0 3 156414 190028 346443 00:08:16.881 1 5 79938 291093 371032 00:08:16.881 PASS test contend 00:08:16.881 Starting test hold_by_poller 00:08:16.881 PASS test hold_by_poller 00:08:16.881 Starting test hold_by_message 00:08:16.881 PASS test hold_by_message 00:08:16.881 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock summary: 00:08:16.881 100014 assertions passed 00:08:16.881 0 assertions failed 00:08:16.881 00:08:16.881 real 0m0.749s 00:08:16.881 user 0m1.121s 00:08:16.881 sys 0m0.128s 00:08:16.881 12:02:29 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:16.881 12:02:29 -- common/autotest_common.sh@10 -- # set +x 00:08:16.881 ************************************ 00:08:16.881 END TEST thread_spdk_lock 00:08:16.881 ************************************ 00:08:16.881 00:08:16.881 real 0m3.472s 00:08:16.881 user 0m3.439s 00:08:16.881 sys 0m0.555s 00:08:16.881 12:02:29 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:16.881 12:02:29 -- common/autotest_common.sh@10 -- # set +x 00:08:16.881 ************************************ 00:08:16.881 END TEST thread 00:08:16.881 ************************************ 00:08:17.140 12:02:29 -- spdk/autotest.sh@189 -- # run_test accel /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:08:17.140 12:02:29 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 
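In the contend table above each worker's columns add up: wait time plus hold time accounts for the reported total to within a microsecond of rounding, so the two workers trading the spinlock leave essentially no time unaccounted:

  echo $(( 156414 + 190028 ))   # -> 346442, vs reported total 346443
  echo $(( 79938 + 291093 ))    # -> 371031, vs reported total 371032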
00:08:17.140 12:02:29 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:17.140 12:02:29 -- common/autotest_common.sh@10 -- # set +x 00:08:17.140 ************************************ 00:08:17.140 START TEST accel 00:08:17.140 ************************************ 00:08:17.140 12:02:29 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:08:17.140 * Looking for test storage... 00:08:17.140 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:08:17.140 12:02:30 -- accel/accel.sh@73 -- # declare -A expected_opcs 00:08:17.140 12:02:30 -- accel/accel.sh@74 -- # get_expected_opcs 00:08:17.140 12:02:30 -- accel/accel.sh@57 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:08:17.140 12:02:30 -- accel/accel.sh@59 -- # spdk_tgt_pid=2681374 00:08:17.140 12:02:30 -- accel/accel.sh@60 -- # waitforlisten 2681374 00:08:17.140 12:02:30 -- common/autotest_common.sh@819 -- # '[' -z 2681374 ']' 00:08:17.140 12:02:30 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:17.140 12:02:30 -- common/autotest_common.sh@824 -- # local max_retries=100 00:08:17.140 12:02:30 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:17.140 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:17.140 12:02:30 -- common/autotest_common.sh@828 -- # xtrace_disable 00:08:17.140 12:02:30 -- common/autotest_common.sh@10 -- # set +x 00:08:17.140 12:02:30 -- accel/accel.sh@58 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:08:17.140 12:02:30 -- accel/accel.sh@58 -- # build_accel_config 00:08:17.140 12:02:30 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:17.140 12:02:30 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:17.140 12:02:30 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:17.140 12:02:30 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:17.140 12:02:30 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:17.140 12:02:30 -- accel/accel.sh@41 -- # local IFS=, 00:08:17.140 12:02:30 -- accel/accel.sh@42 -- # jq -r . 00:08:17.140 [2024-06-11 12:02:30.061551] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:08:17.140 [2024-06-11 12:02:30.061649] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2681374 ] 00:08:17.140 EAL: No free 2048 kB hugepages reported on node 1 00:08:17.399 [2024-06-11 12:02:30.179553] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:17.399 [2024-06-11 12:02:30.225441] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:17.399 [2024-06-11 12:02:30.225595] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:17.967 12:02:30 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:08:17.967 12:02:30 -- common/autotest_common.sh@852 -- # return 0 00:08:17.967 12:02:30 -- accel/accel.sh@62 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:08:17.967 12:02:30 -- accel/accel.sh@62 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:08:17.967 12:02:30 -- accel/accel.sh@62 -- # rpc_cmd accel_get_opc_assignments 00:08:17.967 12:02:30 -- common/autotest_common.sh@551 -- # xtrace_disable 00:08:17.967 12:02:30 -- common/autotest_common.sh@10 -- # set +x 00:08:17.967 12:02:30 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:08:17.967 12:02:30 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:08:17.967 12:02:30 -- accel/accel.sh@64 -- # IFS== 00:08:17.967 12:02:30 -- accel/accel.sh@64 -- # read -r opc module 00:08:17.967 12:02:30 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:08:17.967 12:02:30 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:08:17.967 12:02:30 -- accel/accel.sh@64 -- # IFS== 00:08:17.967 12:02:30 -- accel/accel.sh@64 -- # read -r opc module 00:08:17.967 12:02:30 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:08:17.967 12:02:30 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:08:17.967 12:02:30 -- accel/accel.sh@64 -- # IFS== 00:08:17.967 12:02:30 -- accel/accel.sh@64 -- # read -r opc module 00:08:17.967 12:02:30 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:08:17.967 12:02:30 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:08:17.967 12:02:30 -- accel/accel.sh@64 -- # IFS== 00:08:17.967 12:02:30 -- accel/accel.sh@64 -- # read -r opc module 00:08:17.967 12:02:30 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:08:17.967 12:02:30 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:08:17.967 12:02:30 -- accel/accel.sh@64 -- # IFS== 00:08:17.967 12:02:30 -- accel/accel.sh@64 -- # read -r opc module 00:08:17.967 12:02:30 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:08:17.967 12:02:30 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:08:17.967 12:02:30 -- accel/accel.sh@64 -- # IFS== 00:08:17.967 12:02:30 -- accel/accel.sh@64 -- # read -r opc module 00:08:17.967 12:02:30 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:08:17.967 12:02:30 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:08:17.967 12:02:30 -- accel/accel.sh@64 -- # IFS== 00:08:17.967 12:02:30 -- accel/accel.sh@64 -- # read -r opc module 00:08:17.967 12:02:30 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:08:17.967 12:02:30 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:08:17.967 12:02:30 -- accel/accel.sh@64 -- # IFS== 00:08:17.967 12:02:30 -- accel/accel.sh@64 -- # read -r opc module 00:08:17.967 12:02:30 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:08:17.967 12:02:30 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:08:17.967 12:02:30 -- accel/accel.sh@64 -- # IFS== 00:08:17.967 12:02:30 -- accel/accel.sh@64 -- # read -r opc module 00:08:17.967 12:02:30 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:08:17.967 12:02:30 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:08:17.968 12:02:30 -- accel/accel.sh@64 -- # IFS== 00:08:17.968 12:02:30 -- accel/accel.sh@64 -- # read -r opc module 00:08:17.968 12:02:30 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:08:17.968 12:02:30 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:08:17.968 12:02:30 -- accel/accel.sh@64 -- # IFS== 00:08:17.968 12:02:30 -- accel/accel.sh@64 -- # read -r opc module 00:08:17.968 12:02:30 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:08:17.968 12:02:30 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:08:17.968 12:02:30 -- accel/accel.sh@64 -- # 
IFS== 00:08:17.968 12:02:30 -- accel/accel.sh@64 -- # read -r opc module 00:08:17.968 12:02:30 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:08:17.968 12:02:30 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:08:17.968 12:02:30 -- accel/accel.sh@64 -- # IFS== 00:08:17.968 12:02:30 -- accel/accel.sh@64 -- # read -r opc module 00:08:17.968 12:02:30 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:08:17.968 12:02:30 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:08:17.968 12:02:30 -- accel/accel.sh@64 -- # IFS== 00:08:17.968 12:02:30 -- accel/accel.sh@64 -- # read -r opc module 00:08:17.968 12:02:30 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:08:17.968 12:02:30 -- accel/accel.sh@67 -- # killprocess 2681374 00:08:17.968 12:02:30 -- common/autotest_common.sh@926 -- # '[' -z 2681374 ']' 00:08:17.968 12:02:30 -- common/autotest_common.sh@930 -- # kill -0 2681374 00:08:17.968 12:02:30 -- common/autotest_common.sh@931 -- # uname 00:08:17.968 12:02:30 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:08:17.968 12:02:30 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2681374 00:08:18.227 12:02:31 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:08:18.227 12:02:31 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:08:18.227 12:02:31 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2681374' 00:08:18.227 killing process with pid 2681374 00:08:18.227 12:02:31 -- common/autotest_common.sh@945 -- # kill 2681374 00:08:18.227 12:02:31 -- common/autotest_common.sh@950 -- # wait 2681374 00:08:18.487 12:02:31 -- accel/accel.sh@68 -- # trap - ERR 00:08:18.487 12:02:31 -- accel/accel.sh@81 -- # run_test accel_help accel_perf -h 00:08:18.487 12:02:31 -- common/autotest_common.sh@1077 -- # '[' 3 -le 1 ']' 00:08:18.487 12:02:31 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:18.487 12:02:31 -- common/autotest_common.sh@10 -- # set +x 00:08:18.487 12:02:31 -- common/autotest_common.sh@1104 -- # accel_perf -h 00:08:18.487 12:02:31 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:08:18.487 12:02:31 -- accel/accel.sh@12 -- # build_accel_config 00:08:18.487 12:02:31 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:18.487 12:02:31 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:18.487 12:02:31 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:18.487 12:02:31 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:18.487 12:02:31 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:18.487 12:02:31 -- accel/accel.sh@41 -- # local IFS=, 00:08:18.487 12:02:31 -- accel/accel.sh@42 -- # jq -r . 
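Note: the long IFS== / read -r opc module loop above is accel.sh recording one opcode-to-module assignment per iteration, and with no hardware acceleration configured every opcode resolves to the software module. A sketch of the underlying query, assuming the standard scripts/rpc.py from the SPDK tree (the RPC name and the jq filter are copied verbatim from the trace):

    # Ask the running spdk_tgt which module services each accel opcode,
    # then flatten the JSON object into one "opcode=module" line each.
    ./scripts/rpc.py accel_get_opc_assignments \
        | jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]'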
00:08:18.487 12:02:31 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:18.487 12:02:31 -- common/autotest_common.sh@10 -- # set +x 00:08:18.487 12:02:31 -- accel/accel.sh@83 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:08:18.487 12:02:31 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:08:18.487 12:02:31 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:18.487 12:02:31 -- common/autotest_common.sh@10 -- # set +x 00:08:18.487 ************************************ 00:08:18.487 START TEST accel_missing_filename 00:08:18.487 ************************************ 00:08:18.487 12:02:31 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w compress 00:08:18.487 12:02:31 -- common/autotest_common.sh@640 -- # local es=0 00:08:18.487 12:02:31 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w compress 00:08:18.487 12:02:31 -- common/autotest_common.sh@628 -- # local arg=accel_perf 00:08:18.488 12:02:31 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:08:18.488 12:02:31 -- common/autotest_common.sh@632 -- # type -t accel_perf 00:08:18.488 12:02:31 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:08:18.488 12:02:31 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w compress 00:08:18.488 12:02:31 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:08:18.488 12:02:31 -- accel/accel.sh@12 -- # build_accel_config 00:08:18.488 12:02:31 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:18.488 12:02:31 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:18.488 12:02:31 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:18.488 12:02:31 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:18.488 12:02:31 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:18.488 12:02:31 -- accel/accel.sh@41 -- # local IFS=, 00:08:18.488 12:02:31 -- accel/accel.sh@42 -- # jq -r . 00:08:18.488 [2024-06-11 12:02:31.444198] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:08:18.488 [2024-06-11 12:02:31.444288] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2681595 ] 00:08:18.488 EAL: No free 2048 kB hugepages reported on node 1 00:08:18.747 [2024-06-11 12:02:31.564953] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:18.747 [2024-06-11 12:02:31.613168] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:18.747 [2024-06-11 12:02:31.663206] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:08:18.747 [2024-06-11 12:02:31.735816] accel_perf.c:1385:main: *ERROR*: ERROR starting application 00:08:19.006 A filename is required. 
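Note: "A filename is required." is the expected failure here. accel_perf's compress workload has no default input, so the run aborts without -l and the NOT wrapper turns that abort into a pass. A working compress invocation names the input file (path as used by the next test in this log); that next test then shows the complement, that stacking -y on compress is rejected as well, since compression has no verify mode:

    # compress requires an uncompressed input file, named with -l:
    /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf \
        -t 1 -w compress \
        -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib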
00:08:19.006 12:02:31 -- common/autotest_common.sh@643 -- # es=234 00:08:19.006 12:02:31 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:08:19.006 12:02:31 -- common/autotest_common.sh@652 -- # es=106 00:08:19.006 12:02:31 -- common/autotest_common.sh@653 -- # case "$es" in 00:08:19.006 12:02:31 -- common/autotest_common.sh@660 -- # es=1 00:08:19.006 12:02:31 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:08:19.006 00:08:19.006 real 0m0.389s 00:08:19.006 user 0m0.242s 00:08:19.006 sys 0m0.188s 00:08:19.006 12:02:31 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:19.006 12:02:31 -- common/autotest_common.sh@10 -- # set +x 00:08:19.006 ************************************ 00:08:19.006 END TEST accel_missing_filename 00:08:19.006 ************************************ 00:08:19.007 12:02:31 -- accel/accel.sh@85 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:08:19.007 12:02:31 -- common/autotest_common.sh@1077 -- # '[' 10 -le 1 ']' 00:08:19.007 12:02:31 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:19.007 12:02:31 -- common/autotest_common.sh@10 -- # set +x 00:08:19.007 ************************************ 00:08:19.007 START TEST accel_compress_verify 00:08:19.007 ************************************ 00:08:19.007 12:02:31 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:08:19.007 12:02:31 -- common/autotest_common.sh@640 -- # local es=0 00:08:19.007 12:02:31 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:08:19.007 12:02:31 -- common/autotest_common.sh@628 -- # local arg=accel_perf 00:08:19.007 12:02:31 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:08:19.007 12:02:31 -- common/autotest_common.sh@632 -- # type -t accel_perf 00:08:19.007 12:02:31 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:08:19.007 12:02:31 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:08:19.007 12:02:31 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:08:19.007 12:02:31 -- accel/accel.sh@12 -- # build_accel_config 00:08:19.007 12:02:31 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:19.007 12:02:31 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:19.007 12:02:31 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:19.007 12:02:31 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:19.007 12:02:31 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:19.007 12:02:31 -- accel/accel.sh@41 -- # local IFS=, 00:08:19.007 12:02:31 -- accel/accel.sh@42 -- # jq -r . 00:08:19.007 [2024-06-11 12:02:31.868049] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 
00:08:19.007 [2024-06-11 12:02:31.868101] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2681614 ] 00:08:19.007 EAL: No free 2048 kB hugepages reported on node 1 00:08:19.007 [2024-06-11 12:02:31.968479] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:19.007 [2024-06-11 12:02:32.016063] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:19.265 [2024-06-11 12:02:32.065902] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:08:19.265 [2024-06-11 12:02:32.138736] accel_perf.c:1385:main: *ERROR*: ERROR starting application 00:08:19.265 00:08:19.265 Compression does not support the verify option, aborting. 00:08:19.265 12:02:32 -- common/autotest_common.sh@643 -- # es=161 00:08:19.265 12:02:32 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:08:19.265 12:02:32 -- common/autotest_common.sh@652 -- # es=33 00:08:19.265 12:02:32 -- common/autotest_common.sh@653 -- # case "$es" in 00:08:19.265 12:02:32 -- common/autotest_common.sh@660 -- # es=1 00:08:19.265 12:02:32 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:08:19.265 00:08:19.265 real 0m0.356s 00:08:19.265 user 0m0.227s 00:08:19.265 sys 0m0.171s 00:08:19.265 12:02:32 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:19.265 12:02:32 -- common/autotest_common.sh@10 -- # set +x 00:08:19.265 ************************************ 00:08:19.265 END TEST accel_compress_verify 00:08:19.265 ************************************ 00:08:19.265 12:02:32 -- accel/accel.sh@87 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:08:19.265 12:02:32 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:08:19.265 12:02:32 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:19.265 12:02:32 -- common/autotest_common.sh@10 -- # set +x 00:08:19.265 ************************************ 00:08:19.265 START TEST accel_wrong_workload 00:08:19.265 ************************************ 00:08:19.265 12:02:32 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w foobar 00:08:19.265 12:02:32 -- common/autotest_common.sh@640 -- # local es=0 00:08:19.265 12:02:32 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:08:19.265 12:02:32 -- common/autotest_common.sh@628 -- # local arg=accel_perf 00:08:19.265 12:02:32 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:08:19.265 12:02:32 -- common/autotest_common.sh@632 -- # type -t accel_perf 00:08:19.265 12:02:32 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:08:19.265 12:02:32 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w foobar 00:08:19.265 12:02:32 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:08:19.265 12:02:32 -- accel/accel.sh@12 -- # build_accel_config 00:08:19.265 12:02:32 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:19.265 12:02:32 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:19.265 12:02:32 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:19.265 12:02:32 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:19.265 12:02:32 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:19.265 12:02:32 -- accel/accel.sh@41 -- # local IFS=, 00:08:19.265 12:02:32 -- accel/accel.sh@42 -- # jq -r . 
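Note: the es=161, es=33, es=1 chain above is autotest_common.sh normalizing the failing child's exit status before the final NOT check: values above 128 (signal deaths) have 128 subtracted, a case statement then collapses the remainder to 1, and (( !es == 0 )) succeeds only because the command failed. A condensed sketch of the idiom, with the helper name invented for illustration (the real NOT() goes through valid_exec_arg and a case statement):

    # Run a command that is supposed to fail; succeed only if it does.
    run_expecting_failure() {
        "$@"
        local es=$?
        (( es > 128 )) && es=$(( es - 128 ))   # strip signal-death offset
        (( es != 0 )) && es=1                  # collapse all failures to 1
        (( ! es == 0 ))                        # invert: failure becomes success
    }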
00:08:19.265 Unsupported workload type: foobar 00:08:19.265 [2024-06-11 12:02:32.282852] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:08:19.265 accel_perf options: 00:08:19.265 [-h help message] 00:08:19.265 [-q queue depth per core] 00:08:19.265 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:08:19.265 [-T number of threads per core 00:08:19.265 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:08:19.265 [-t time in seconds] 00:08:19.265 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:08:19.265 [ dif_verify, , dif_generate, dif_generate_copy 00:08:19.265 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:08:19.265 [-l for compress/decompress workloads, name of uncompressed input file 00:08:19.265 [-S for crc32c workload, use this seed value (default 0) 00:08:19.265 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:08:19.265 [-f for fill workload, use this BYTE value (default 255) 00:08:19.265 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:08:19.265 [-y verify result if this switch is on] 00:08:19.265 [-a tasks to allocate per core (default: same value as -q)] 00:08:19.265 Can be used to spread operations across a wider range of memory. 00:08:19.265 12:02:32 -- common/autotest_common.sh@643 -- # es=1 00:08:19.265 12:02:32 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:08:19.265 12:02:32 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:08:19.265 12:02:32 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:08:19.265 00:08:19.265 real 0m0.028s 00:08:19.265 user 0m0.008s 00:08:19.265 sys 0m0.020s 00:08:19.265 12:02:32 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:19.265 12:02:32 -- common/autotest_common.sh@10 -- # set +x 00:08:19.265 ************************************ 00:08:19.265 END TEST accel_wrong_workload 00:08:19.265 ************************************ 00:08:19.524 Error: writing output failed: Broken pipe 00:08:19.524 12:02:32 -- accel/accel.sh@89 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:08:19.524 12:02:32 -- common/autotest_common.sh@1077 -- # '[' 10 -le 1 ']' 00:08:19.524 12:02:32 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:19.524 12:02:32 -- common/autotest_common.sh@10 -- # set +x 00:08:19.524 ************************************ 00:08:19.524 START TEST accel_negative_buffers 00:08:19.524 ************************************ 00:08:19.524 12:02:32 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:08:19.524 12:02:32 -- common/autotest_common.sh@640 -- # local es=0 00:08:19.524 12:02:32 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:08:19.524 12:02:32 -- common/autotest_common.sh@628 -- # local arg=accel_perf 00:08:19.524 12:02:32 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:08:19.524 12:02:32 -- common/autotest_common.sh@632 -- # type -t accel_perf 00:08:19.524 12:02:32 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:08:19.524 12:02:32 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w xor -y -x -1 00:08:19.524 12:02:32 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w 
xor -y -x -1 00:08:19.524 12:02:32 -- accel/accel.sh@12 -- # build_accel_config 00:08:19.524 12:02:32 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:19.524 12:02:32 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:19.524 12:02:32 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:19.524 12:02:32 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:19.524 12:02:32 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:19.524 12:02:32 -- accel/accel.sh@41 -- # local IFS=, 00:08:19.524 12:02:32 -- accel/accel.sh@42 -- # jq -r . 00:08:19.524 -x option must be non-negative. 00:08:19.524 [2024-06-11 12:02:32.358471] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:08:19.524 accel_perf options: 00:08:19.524 [-h help message] 00:08:19.524 [-q queue depth per core] 00:08:19.524 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:08:19.524 [-T number of threads per core 00:08:19.524 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:08:19.524 [-t time in seconds] 00:08:19.524 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:08:19.524 [ dif_verify, , dif_generate, dif_generate_copy 00:08:19.524 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:08:19.524 [-l for compress/decompress workloads, name of uncompressed input file 00:08:19.524 [-S for crc32c workload, use this seed value (default 0) 00:08:19.524 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:08:19.524 [-f for fill workload, use this BYTE value (default 255) 00:08:19.524 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:08:19.524 [-y verify result if this switch is on] 00:08:19.524 [-a tasks to allocate per core (default: same value as -q)] 00:08:19.524 Can be used to spread operations across a wider range of memory. 
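Note: the option dump is printed twice in this stream because both negative tests (the foobar workload and -x -1) trip argument parsing, and it doubles as the flag reference for every run that follows. The simplest passing shape, assembled only from flags documented above and matching the crc32c run that comes next:

    # -t run time in seconds, -w workload type, -S crc32c seed, -y verify results:
    /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf \
        -t 1 -w crc32c -S 32 -y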
00:08:19.524 12:02:32 -- common/autotest_common.sh@643 -- # es=1 00:08:19.524 12:02:32 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:08:19.524 12:02:32 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:08:19.524 12:02:32 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:08:19.524 00:08:19.524 real 0m0.028s 00:08:19.524 user 0m0.013s 00:08:19.524 sys 0m0.015s 00:08:19.524 12:02:32 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:19.524 12:02:32 -- common/autotest_common.sh@10 -- # set +x 00:08:19.524 ************************************ 00:08:19.524 END TEST accel_negative_buffers 00:08:19.524 ************************************ 00:08:19.524 Error: writing output failed: Broken pipe 00:08:19.524 12:02:32 -- accel/accel.sh@93 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:08:19.524 12:02:32 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:08:19.524 12:02:32 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:19.524 12:02:32 -- common/autotest_common.sh@10 -- # set +x 00:08:19.524 ************************************ 00:08:19.524 START TEST accel_crc32c 00:08:19.524 ************************************ 00:08:19.524 12:02:32 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w crc32c -S 32 -y 00:08:19.524 12:02:32 -- accel/accel.sh@16 -- # local accel_opc 00:08:19.524 12:02:32 -- accel/accel.sh@17 -- # local accel_module 00:08:19.524 12:02:32 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:08:19.524 12:02:32 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:08:19.524 12:02:32 -- accel/accel.sh@12 -- # build_accel_config 00:08:19.524 12:02:32 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:19.524 12:02:32 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:19.524 12:02:32 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:19.524 12:02:32 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:19.524 12:02:32 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:19.524 12:02:32 -- accel/accel.sh@41 -- # local IFS=, 00:08:19.524 12:02:32 -- accel/accel.sh@42 -- # jq -r . 00:08:19.524 [2024-06-11 12:02:32.436005] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:08:19.524 [2024-06-11 12:02:32.436110] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2681767 ] 00:08:19.524 EAL: No free 2048 kB hugepages reported on node 1 00:08:19.784 [2024-06-11 12:02:32.557731] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:19.784 [2024-06-11 12:02:32.606156] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:21.162 12:02:33 -- accel/accel.sh@18 -- # out=' 00:08:21.162 SPDK Configuration: 00:08:21.162 Core mask: 0x1 00:08:21.162 00:08:21.162 Accel Perf Configuration: 00:08:21.162 Workload Type: crc32c 00:08:21.162 CRC-32C seed: 32 00:08:21.162 Transfer size: 4096 bytes 00:08:21.162 Vector count 1 00:08:21.162 Module: software 00:08:21.162 Queue depth: 32 00:08:21.162 Allocate depth: 32 00:08:21.162 # threads/core: 1 00:08:21.162 Run time: 1 seconds 00:08:21.162 Verify: Yes 00:08:21.162 00:08:21.162 Running for 1 seconds... 
00:08:21.162 00:08:21.162 Core,Thread Transfers Bandwidth Failed Miscompares 00:08:21.162 ------------------------------------------------------------------------------------ 00:08:21.162 0,0 522624/s 2041 MiB/s 0 0 00:08:21.162 ==================================================================================== 00:08:21.162 Total 522624/s 2041 MiB/s 0 0' 00:08:21.162 12:02:33 -- accel/accel.sh@20 -- # IFS=: 00:08:21.162 12:02:33 -- accel/accel.sh@20 -- # read -r var val 00:08:21.162 12:02:33 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:08:21.162 12:02:33 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:08:21.162 12:02:33 -- accel/accel.sh@12 -- # build_accel_config 00:08:21.162 12:02:33 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:21.162 12:02:33 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:21.162 12:02:33 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:21.162 12:02:33 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:21.162 12:02:33 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:21.162 12:02:33 -- accel/accel.sh@41 -- # local IFS=, 00:08:21.162 12:02:33 -- accel/accel.sh@42 -- # jq -r . 00:08:21.162 [2024-06-11 12:02:33.826499] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:08:21.162 [2024-06-11 12:02:33.826589] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2681966 ] 00:08:21.162 EAL: No free 2048 kB hugepages reported on node 1 00:08:21.162 [2024-06-11 12:02:33.948044] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:21.162 [2024-06-11 12:02:33.995730] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:21.162 12:02:34 -- accel/accel.sh@21 -- # val= 00:08:21.162 12:02:34 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.162 12:02:34 -- accel/accel.sh@20 -- # IFS=: 00:08:21.162 12:02:34 -- accel/accel.sh@20 -- # read -r var val 00:08:21.162 12:02:34 -- accel/accel.sh@21 -- # val= 00:08:21.162 12:02:34 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.162 12:02:34 -- accel/accel.sh@20 -- # IFS=: 00:08:21.162 12:02:34 -- accel/accel.sh@20 -- # read -r var val 00:08:21.162 12:02:34 -- accel/accel.sh@21 -- # val=0x1 00:08:21.162 12:02:34 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.162 12:02:34 -- accel/accel.sh@20 -- # IFS=: 00:08:21.162 12:02:34 -- accel/accel.sh@20 -- # read -r var val 00:08:21.162 12:02:34 -- accel/accel.sh@21 -- # val= 00:08:21.162 12:02:34 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.162 12:02:34 -- accel/accel.sh@20 -- # IFS=: 00:08:21.162 12:02:34 -- accel/accel.sh@20 -- # read -r var val 00:08:21.162 12:02:34 -- accel/accel.sh@21 -- # val= 00:08:21.162 12:02:34 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.162 12:02:34 -- accel/accel.sh@20 -- # IFS=: 00:08:21.162 12:02:34 -- accel/accel.sh@20 -- # read -r var val 00:08:21.162 12:02:34 -- accel/accel.sh@21 -- # val=crc32c 00:08:21.162 12:02:34 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.162 12:02:34 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:08:21.162 12:02:34 -- accel/accel.sh@20 -- # IFS=: 00:08:21.162 12:02:34 -- accel/accel.sh@20 -- # read -r var val 00:08:21.162 12:02:34 -- accel/accel.sh@21 -- # val=32 00:08:21.162 12:02:34 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.162 12:02:34 -- accel/accel.sh@20 -- # IFS=: 00:08:21.162 
12:02:34 -- accel/accel.sh@20 -- # read -r var val 00:08:21.162 12:02:34 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:21.162 12:02:34 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.162 12:02:34 -- accel/accel.sh@20 -- # IFS=: 00:08:21.162 12:02:34 -- accel/accel.sh@20 -- # read -r var val 00:08:21.162 12:02:34 -- accel/accel.sh@21 -- # val= 00:08:21.162 12:02:34 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.162 12:02:34 -- accel/accel.sh@20 -- # IFS=: 00:08:21.163 12:02:34 -- accel/accel.sh@20 -- # read -r var val 00:08:21.163 12:02:34 -- accel/accel.sh@21 -- # val=software 00:08:21.163 12:02:34 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.163 12:02:34 -- accel/accel.sh@23 -- # accel_module=software 00:08:21.163 12:02:34 -- accel/accel.sh@20 -- # IFS=: 00:08:21.163 12:02:34 -- accel/accel.sh@20 -- # read -r var val 00:08:21.163 12:02:34 -- accel/accel.sh@21 -- # val=32 00:08:21.163 12:02:34 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.163 12:02:34 -- accel/accel.sh@20 -- # IFS=: 00:08:21.163 12:02:34 -- accel/accel.sh@20 -- # read -r var val 00:08:21.163 12:02:34 -- accel/accel.sh@21 -- # val=32 00:08:21.163 12:02:34 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.163 12:02:34 -- accel/accel.sh@20 -- # IFS=: 00:08:21.163 12:02:34 -- accel/accel.sh@20 -- # read -r var val 00:08:21.163 12:02:34 -- accel/accel.sh@21 -- # val=1 00:08:21.163 12:02:34 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.163 12:02:34 -- accel/accel.sh@20 -- # IFS=: 00:08:21.163 12:02:34 -- accel/accel.sh@20 -- # read -r var val 00:08:21.163 12:02:34 -- accel/accel.sh@21 -- # val='1 seconds' 00:08:21.163 12:02:34 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.163 12:02:34 -- accel/accel.sh@20 -- # IFS=: 00:08:21.163 12:02:34 -- accel/accel.sh@20 -- # read -r var val 00:08:21.163 12:02:34 -- accel/accel.sh@21 -- # val=Yes 00:08:21.163 12:02:34 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.163 12:02:34 -- accel/accel.sh@20 -- # IFS=: 00:08:21.163 12:02:34 -- accel/accel.sh@20 -- # read -r var val 00:08:21.163 12:02:34 -- accel/accel.sh@21 -- # val= 00:08:21.163 12:02:34 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.163 12:02:34 -- accel/accel.sh@20 -- # IFS=: 00:08:21.163 12:02:34 -- accel/accel.sh@20 -- # read -r var val 00:08:21.163 12:02:34 -- accel/accel.sh@21 -- # val= 00:08:21.163 12:02:34 -- accel/accel.sh@22 -- # case "$var" in 00:08:21.163 12:02:34 -- accel/accel.sh@20 -- # IFS=: 00:08:21.163 12:02:34 -- accel/accel.sh@20 -- # read -r var val 00:08:22.541 12:02:35 -- accel/accel.sh@21 -- # val= 00:08:22.541 12:02:35 -- accel/accel.sh@22 -- # case "$var" in 00:08:22.541 12:02:35 -- accel/accel.sh@20 -- # IFS=: 00:08:22.541 12:02:35 -- accel/accel.sh@20 -- # read -r var val 00:08:22.541 12:02:35 -- accel/accel.sh@21 -- # val= 00:08:22.541 12:02:35 -- accel/accel.sh@22 -- # case "$var" in 00:08:22.541 12:02:35 -- accel/accel.sh@20 -- # IFS=: 00:08:22.541 12:02:35 -- accel/accel.sh@20 -- # read -r var val 00:08:22.541 12:02:35 -- accel/accel.sh@21 -- # val= 00:08:22.541 12:02:35 -- accel/accel.sh@22 -- # case "$var" in 00:08:22.541 12:02:35 -- accel/accel.sh@20 -- # IFS=: 00:08:22.541 12:02:35 -- accel/accel.sh@20 -- # read -r var val 00:08:22.541 12:02:35 -- accel/accel.sh@21 -- # val= 00:08:22.541 12:02:35 -- accel/accel.sh@22 -- # case "$var" in 00:08:22.541 12:02:35 -- accel/accel.sh@20 -- # IFS=: 00:08:22.541 12:02:35 -- accel/accel.sh@20 -- # read -r var val 00:08:22.541 12:02:35 -- accel/accel.sh@21 -- # val= 00:08:22.541 12:02:35 -- accel/accel.sh@22 -- # case "$var" in 
00:08:22.541 12:02:35 -- accel/accel.sh@20 -- # IFS=: 00:08:22.541 12:02:35 -- accel/accel.sh@20 -- # read -r var val 00:08:22.541 12:02:35 -- accel/accel.sh@21 -- # val= 00:08:22.541 12:02:35 -- accel/accel.sh@22 -- # case "$var" in 00:08:22.541 12:02:35 -- accel/accel.sh@20 -- # IFS=: 00:08:22.541 12:02:35 -- accel/accel.sh@20 -- # read -r var val 00:08:22.541 12:02:35 -- accel/accel.sh@28 -- # [[ -n software ]] 00:08:22.541 12:02:35 -- accel/accel.sh@28 -- # [[ -n crc32c ]] 00:08:22.541 12:02:35 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:22.541 00:08:22.541 real 0m2.770s 00:08:22.541 user 0m2.403s 00:08:22.541 sys 0m0.372s 00:08:22.541 12:02:35 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:22.541 12:02:35 -- common/autotest_common.sh@10 -- # set +x 00:08:22.541 ************************************ 00:08:22.541 END TEST accel_crc32c 00:08:22.541 ************************************ 00:08:22.541 12:02:35 -- accel/accel.sh@94 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:08:22.541 12:02:35 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:08:22.541 12:02:35 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:22.541 12:02:35 -- common/autotest_common.sh@10 -- # set +x 00:08:22.541 ************************************ 00:08:22.541 START TEST accel_crc32c_C2 00:08:22.541 ************************************ 00:08:22.541 12:02:35 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w crc32c -y -C 2 00:08:22.541 12:02:35 -- accel/accel.sh@16 -- # local accel_opc 00:08:22.541 12:02:35 -- accel/accel.sh@17 -- # local accel_module 00:08:22.541 12:02:35 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -y -C 2 00:08:22.541 12:02:35 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:08:22.541 12:02:35 -- accel/accel.sh@12 -- # build_accel_config 00:08:22.541 12:02:35 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:22.542 12:02:35 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:22.542 12:02:35 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:22.542 12:02:35 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:22.542 12:02:35 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:22.542 12:02:35 -- accel/accel.sh@41 -- # local IFS=, 00:08:22.542 12:02:35 -- accel/accel.sh@42 -- # jq -r . 00:08:22.542 [2024-06-11 12:02:35.234777] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 
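Note: the crc32c result above is internally consistent: 522624 four-KiB transfers per second is exactly the reported 2041 MiB/s. A one-line check:

    # transfers/s * 4096 bytes, expressed in MiB/s:
    echo $(( 522624 * 4096 / 1024 / 1024 ))   # prints 2041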
00:08:22.542 [2024-06-11 12:02:35.234830] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2682213 ] 00:08:22.542 EAL: No free 2048 kB hugepages reported on node 1 00:08:22.542 [2024-06-11 12:02:35.334931] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:22.542 [2024-06-11 12:02:35.382581] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:23.922 12:02:36 -- accel/accel.sh@18 -- # out=' 00:08:23.922 SPDK Configuration: 00:08:23.922 Core mask: 0x1 00:08:23.922 00:08:23.922 Accel Perf Configuration: 00:08:23.922 Workload Type: crc32c 00:08:23.922 CRC-32C seed: 0 00:08:23.922 Transfer size: 4096 bytes 00:08:23.922 Vector count 2 00:08:23.922 Module: software 00:08:23.922 Queue depth: 32 00:08:23.922 Allocate depth: 32 00:08:23.922 # threads/core: 1 00:08:23.922 Run time: 1 seconds 00:08:23.922 Verify: Yes 00:08:23.922 00:08:23.922 Running for 1 seconds... 00:08:23.922 00:08:23.922 Core,Thread Transfers Bandwidth Failed Miscompares 00:08:23.922 ------------------------------------------------------------------------------------ 00:08:23.922 0,0 381248/s 2978 MiB/s 0 0 00:08:23.922 ==================================================================================== 00:08:23.922 Total 381248/s 1489 MiB/s 0 0' 00:08:23.922 12:02:36 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:08:23.922 12:02:36 -- accel/accel.sh@20 -- # IFS=: 00:08:23.922 12:02:36 -- accel/accel.sh@20 -- # read -r var val 00:08:23.922 12:02:36 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:08:23.922 12:02:36 -- accel/accel.sh@12 -- # build_accel_config 00:08:23.922 12:02:36 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:23.922 12:02:36 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:23.922 12:02:36 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:23.922 12:02:36 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:23.922 12:02:36 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:23.922 12:02:36 -- accel/accel.sh@41 -- # local IFS=, 00:08:23.922 12:02:36 -- accel/accel.sh@42 -- # jq -r . 00:08:23.922 [2024-06-11 12:02:36.578953] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 
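Note: with -C 2 (vector count 2) each completed transfer carries two 4 KiB buffers, which is why the per-core row above reports 2978 MiB/s at a lower transfer rate than the single-vector run; the closing Total line appears to drop the vector multiplier (381248 * 4096 bytes is the printed 1489 MiB/s). The same check with the factor of two:

    # transfers/s * 4096 bytes * 2 vectors, expressed in MiB/s:
    echo $(( 381248 * 4096 * 2 / 1024 / 1024 ))   # prints 2978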
00:08:23.922 [2024-06-11 12:02:36.579022] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2682401 ] 00:08:23.922 EAL: No free 2048 kB hugepages reported on node 1 00:08:23.922 [2024-06-11 12:02:36.684288] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:23.922 [2024-06-11 12:02:36.731770] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:23.922 12:02:36 -- accel/accel.sh@21 -- # val= 00:08:23.922 12:02:36 -- accel/accel.sh@22 -- # case "$var" in 00:08:23.922 12:02:36 -- accel/accel.sh@20 -- # IFS=: 00:08:23.922 12:02:36 -- accel/accel.sh@20 -- # read -r var val 00:08:23.922 12:02:36 -- accel/accel.sh@21 -- # val= 00:08:23.922 12:02:36 -- accel/accel.sh@22 -- # case "$var" in 00:08:23.922 12:02:36 -- accel/accel.sh@20 -- # IFS=: 00:08:23.922 12:02:36 -- accel/accel.sh@20 -- # read -r var val 00:08:23.922 12:02:36 -- accel/accel.sh@21 -- # val=0x1 00:08:23.922 12:02:36 -- accel/accel.sh@22 -- # case "$var" in 00:08:23.922 12:02:36 -- accel/accel.sh@20 -- # IFS=: 00:08:23.922 12:02:36 -- accel/accel.sh@20 -- # read -r var val 00:08:23.922 12:02:36 -- accel/accel.sh@21 -- # val= 00:08:23.922 12:02:36 -- accel/accel.sh@22 -- # case "$var" in 00:08:23.922 12:02:36 -- accel/accel.sh@20 -- # IFS=: 00:08:23.922 12:02:36 -- accel/accel.sh@20 -- # read -r var val 00:08:23.922 12:02:36 -- accel/accel.sh@21 -- # val= 00:08:23.922 12:02:36 -- accel/accel.sh@22 -- # case "$var" in 00:08:23.922 12:02:36 -- accel/accel.sh@20 -- # IFS=: 00:08:23.922 12:02:36 -- accel/accel.sh@20 -- # read -r var val 00:08:23.922 12:02:36 -- accel/accel.sh@21 -- # val=crc32c 00:08:23.922 12:02:36 -- accel/accel.sh@22 -- # case "$var" in 00:08:23.922 12:02:36 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:08:23.922 12:02:36 -- accel/accel.sh@20 -- # IFS=: 00:08:23.922 12:02:36 -- accel/accel.sh@20 -- # read -r var val 00:08:23.922 12:02:36 -- accel/accel.sh@21 -- # val=0 00:08:23.922 12:02:36 -- accel/accel.sh@22 -- # case "$var" in 00:08:23.922 12:02:36 -- accel/accel.sh@20 -- # IFS=: 00:08:23.922 12:02:36 -- accel/accel.sh@20 -- # read -r var val 00:08:23.922 12:02:36 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:23.922 12:02:36 -- accel/accel.sh@22 -- # case "$var" in 00:08:23.922 12:02:36 -- accel/accel.sh@20 -- # IFS=: 00:08:23.922 12:02:36 -- accel/accel.sh@20 -- # read -r var val 00:08:23.922 12:02:36 -- accel/accel.sh@21 -- # val= 00:08:23.922 12:02:36 -- accel/accel.sh@22 -- # case "$var" in 00:08:23.922 12:02:36 -- accel/accel.sh@20 -- # IFS=: 00:08:23.922 12:02:36 -- accel/accel.sh@20 -- # read -r var val 00:08:23.922 12:02:36 -- accel/accel.sh@21 -- # val=software 00:08:23.922 12:02:36 -- accel/accel.sh@22 -- # case "$var" in 00:08:23.922 12:02:36 -- accel/accel.sh@23 -- # accel_module=software 00:08:23.922 12:02:36 -- accel/accel.sh@20 -- # IFS=: 00:08:23.922 12:02:36 -- accel/accel.sh@20 -- # read -r var val 00:08:23.922 12:02:36 -- accel/accel.sh@21 -- # val=32 00:08:23.922 12:02:36 -- accel/accel.sh@22 -- # case "$var" in 00:08:23.922 12:02:36 -- accel/accel.sh@20 -- # IFS=: 00:08:23.922 12:02:36 -- accel/accel.sh@20 -- # read -r var val 00:08:23.922 12:02:36 -- accel/accel.sh@21 -- # val=32 00:08:23.922 12:02:36 -- accel/accel.sh@22 -- # case "$var" in 00:08:23.922 12:02:36 -- accel/accel.sh@20 -- # IFS=: 00:08:23.922 12:02:36 -- accel/accel.sh@20 -- # read -r var val 00:08:23.922 12:02:36 -- 
accel/accel.sh@21 -- # val=1 00:08:23.922 12:02:36 -- accel/accel.sh@22 -- # case "$var" in 00:08:23.922 12:02:36 -- accel/accel.sh@20 -- # IFS=: 00:08:23.922 12:02:36 -- accel/accel.sh@20 -- # read -r var val 00:08:23.922 12:02:36 -- accel/accel.sh@21 -- # val='1 seconds' 00:08:23.922 12:02:36 -- accel/accel.sh@22 -- # case "$var" in 00:08:23.922 12:02:36 -- accel/accel.sh@20 -- # IFS=: 00:08:23.922 12:02:36 -- accel/accel.sh@20 -- # read -r var val 00:08:23.922 12:02:36 -- accel/accel.sh@21 -- # val=Yes 00:08:23.922 12:02:36 -- accel/accel.sh@22 -- # case "$var" in 00:08:23.922 12:02:36 -- accel/accel.sh@20 -- # IFS=: 00:08:23.922 12:02:36 -- accel/accel.sh@20 -- # read -r var val 00:08:23.922 12:02:36 -- accel/accel.sh@21 -- # val= 00:08:23.922 12:02:36 -- accel/accel.sh@22 -- # case "$var" in 00:08:23.922 12:02:36 -- accel/accel.sh@20 -- # IFS=: 00:08:23.922 12:02:36 -- accel/accel.sh@20 -- # read -r var val 00:08:23.922 12:02:36 -- accel/accel.sh@21 -- # val= 00:08:23.922 12:02:36 -- accel/accel.sh@22 -- # case "$var" in 00:08:23.922 12:02:36 -- accel/accel.sh@20 -- # IFS=: 00:08:23.922 12:02:36 -- accel/accel.sh@20 -- # read -r var val 00:08:25.300 12:02:37 -- accel/accel.sh@21 -- # val= 00:08:25.300 12:02:37 -- accel/accel.sh@22 -- # case "$var" in 00:08:25.300 12:02:37 -- accel/accel.sh@20 -- # IFS=: 00:08:25.300 12:02:37 -- accel/accel.sh@20 -- # read -r var val 00:08:25.300 12:02:37 -- accel/accel.sh@21 -- # val= 00:08:25.300 12:02:37 -- accel/accel.sh@22 -- # case "$var" in 00:08:25.300 12:02:37 -- accel/accel.sh@20 -- # IFS=: 00:08:25.300 12:02:37 -- accel/accel.sh@20 -- # read -r var val 00:08:25.300 12:02:37 -- accel/accel.sh@21 -- # val= 00:08:25.300 12:02:37 -- accel/accel.sh@22 -- # case "$var" in 00:08:25.300 12:02:37 -- accel/accel.sh@20 -- # IFS=: 00:08:25.300 12:02:37 -- accel/accel.sh@20 -- # read -r var val 00:08:25.300 12:02:37 -- accel/accel.sh@21 -- # val= 00:08:25.300 12:02:37 -- accel/accel.sh@22 -- # case "$var" in 00:08:25.300 12:02:37 -- accel/accel.sh@20 -- # IFS=: 00:08:25.300 12:02:37 -- accel/accel.sh@20 -- # read -r var val 00:08:25.300 12:02:37 -- accel/accel.sh@21 -- # val= 00:08:25.300 12:02:37 -- accel/accel.sh@22 -- # case "$var" in 00:08:25.300 12:02:37 -- accel/accel.sh@20 -- # IFS=: 00:08:25.300 12:02:37 -- accel/accel.sh@20 -- # read -r var val 00:08:25.300 12:02:37 -- accel/accel.sh@21 -- # val= 00:08:25.300 12:02:37 -- accel/accel.sh@22 -- # case "$var" in 00:08:25.300 12:02:37 -- accel/accel.sh@20 -- # IFS=: 00:08:25.300 12:02:37 -- accel/accel.sh@20 -- # read -r var val 00:08:25.300 12:02:37 -- accel/accel.sh@28 -- # [[ -n software ]] 00:08:25.300 12:02:37 -- accel/accel.sh@28 -- # [[ -n crc32c ]] 00:08:25.300 12:02:37 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:25.300 00:08:25.300 real 0m2.698s 00:08:25.300 user 0m2.372s 00:08:25.300 sys 0m0.333s 00:08:25.300 12:02:37 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:25.300 12:02:37 -- common/autotest_common.sh@10 -- # set +x 00:08:25.300 ************************************ 00:08:25.300 END TEST accel_crc32c_C2 00:08:25.300 ************************************ 00:08:25.300 12:02:37 -- accel/accel.sh@95 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:08:25.300 12:02:37 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:08:25.301 12:02:37 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:25.301 12:02:37 -- common/autotest_common.sh@10 -- # set +x 00:08:25.301 ************************************ 00:08:25.301 START TEST accel_copy 
00:08:25.301 ************************************ 00:08:25.301 12:02:37 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w copy -y 00:08:25.301 12:02:37 -- accel/accel.sh@16 -- # local accel_opc 00:08:25.301 12:02:37 -- accel/accel.sh@17 -- # local accel_module 00:08:25.301 12:02:37 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy -y 00:08:25.301 12:02:37 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:08:25.301 12:02:37 -- accel/accel.sh@12 -- # build_accel_config 00:08:25.301 12:02:37 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:25.301 12:02:37 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:25.301 12:02:37 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:25.301 12:02:37 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:25.301 12:02:37 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:25.301 12:02:37 -- accel/accel.sh@41 -- # local IFS=, 00:08:25.301 12:02:37 -- accel/accel.sh@42 -- # jq -r . 00:08:25.301 [2024-06-11 12:02:37.989501] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:08:25.301 [2024-06-11 12:02:37.989582] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2682597 ] 00:08:25.301 EAL: No free 2048 kB hugepages reported on node 1 00:08:25.301 [2024-06-11 12:02:38.108399] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:25.301 [2024-06-11 12:02:38.155880] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:26.679 12:02:39 -- accel/accel.sh@18 -- # out=' 00:08:26.679 SPDK Configuration: 00:08:26.679 Core mask: 0x1 00:08:26.679 00:08:26.679 Accel Perf Configuration: 00:08:26.679 Workload Type: copy 00:08:26.679 Transfer size: 4096 bytes 00:08:26.679 Vector count 1 00:08:26.679 Module: software 00:08:26.679 Queue depth: 32 00:08:26.679 Allocate depth: 32 00:08:26.679 # threads/core: 1 00:08:26.679 Run time: 1 seconds 00:08:26.679 Verify: Yes 00:08:26.679 00:08:26.679 Running for 1 seconds... 00:08:26.679 00:08:26.679 Core,Thread Transfers Bandwidth Failed Miscompares 00:08:26.679 ------------------------------------------------------------------------------------ 00:08:26.679 0,0 334944/s 1308 MiB/s 0 0 00:08:26.679 ==================================================================================== 00:08:26.679 Total 334944/s 1308 MiB/s 0 0' 00:08:26.679 12:02:39 -- accel/accel.sh@20 -- # IFS=: 00:08:26.679 12:02:39 -- accel/accel.sh@20 -- # read -r var val 00:08:26.679 12:02:39 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:08:26.679 12:02:39 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:08:26.679 12:02:39 -- accel/accel.sh@12 -- # build_accel_config 00:08:26.679 12:02:39 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:26.679 12:02:39 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:26.679 12:02:39 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:26.679 12:02:39 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:26.679 12:02:39 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:26.679 12:02:39 -- accel/accel.sh@41 -- # local IFS=, 00:08:26.679 12:02:39 -- accel/accel.sh@42 -- # jq -r . 00:08:26.679 [2024-06-11 12:02:39.374188] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 
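Note: a software-module copy is essentially a memcpy dispatched through the accel framework, so the 334944 transfers/s (1308 MiB/s at 4 KiB) above says more about per-operation framework overhead than about raw memory bandwidth. Reduced to its flags from the trace:

    # One-second software copy run, verifying each 4 KiB buffer:
    /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf \
        -t 1 -w copy -y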
00:08:26.679 [2024-06-11 12:02:39.374278] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2682777 ] 00:08:26.679 EAL: No free 2048 kB hugepages reported on node 1 00:08:26.679 [2024-06-11 12:02:39.493618] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:26.679 [2024-06-11 12:02:39.540878] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:26.679 12:02:39 -- accel/accel.sh@21 -- # val= 00:08:26.679 12:02:39 -- accel/accel.sh@22 -- # case "$var" in 00:08:26.679 12:02:39 -- accel/accel.sh@20 -- # IFS=: 00:08:26.679 12:02:39 -- accel/accel.sh@20 -- # read -r var val 00:08:26.679 12:02:39 -- accel/accel.sh@21 -- # val= 00:08:26.679 12:02:39 -- accel/accel.sh@22 -- # case "$var" in 00:08:26.679 12:02:39 -- accel/accel.sh@20 -- # IFS=: 00:08:26.679 12:02:39 -- accel/accel.sh@20 -- # read -r var val 00:08:26.679 12:02:39 -- accel/accel.sh@21 -- # val=0x1 00:08:26.679 12:02:39 -- accel/accel.sh@22 -- # case "$var" in 00:08:26.679 12:02:39 -- accel/accel.sh@20 -- # IFS=: 00:08:26.679 12:02:39 -- accel/accel.sh@20 -- # read -r var val 00:08:26.679 12:02:39 -- accel/accel.sh@21 -- # val= 00:08:26.679 12:02:39 -- accel/accel.sh@22 -- # case "$var" in 00:08:26.679 12:02:39 -- accel/accel.sh@20 -- # IFS=: 00:08:26.679 12:02:39 -- accel/accel.sh@20 -- # read -r var val 00:08:26.679 12:02:39 -- accel/accel.sh@21 -- # val= 00:08:26.679 12:02:39 -- accel/accel.sh@22 -- # case "$var" in 00:08:26.679 12:02:39 -- accel/accel.sh@20 -- # IFS=: 00:08:26.679 12:02:39 -- accel/accel.sh@20 -- # read -r var val 00:08:26.679 12:02:39 -- accel/accel.sh@21 -- # val=copy 00:08:26.679 12:02:39 -- accel/accel.sh@22 -- # case "$var" in 00:08:26.679 12:02:39 -- accel/accel.sh@24 -- # accel_opc=copy 00:08:26.679 12:02:39 -- accel/accel.sh@20 -- # IFS=: 00:08:26.679 12:02:39 -- accel/accel.sh@20 -- # read -r var val 00:08:26.679 12:02:39 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:26.679 12:02:39 -- accel/accel.sh@22 -- # case "$var" in 00:08:26.679 12:02:39 -- accel/accel.sh@20 -- # IFS=: 00:08:26.679 12:02:39 -- accel/accel.sh@20 -- # read -r var val 00:08:26.679 12:02:39 -- accel/accel.sh@21 -- # val= 00:08:26.679 12:02:39 -- accel/accel.sh@22 -- # case "$var" in 00:08:26.679 12:02:39 -- accel/accel.sh@20 -- # IFS=: 00:08:26.679 12:02:39 -- accel/accel.sh@20 -- # read -r var val 00:08:26.679 12:02:39 -- accel/accel.sh@21 -- # val=software 00:08:26.679 12:02:39 -- accel/accel.sh@22 -- # case "$var" in 00:08:26.679 12:02:39 -- accel/accel.sh@23 -- # accel_module=software 00:08:26.679 12:02:39 -- accel/accel.sh@20 -- # IFS=: 00:08:26.679 12:02:39 -- accel/accel.sh@20 -- # read -r var val 00:08:26.679 12:02:39 -- accel/accel.sh@21 -- # val=32 00:08:26.679 12:02:39 -- accel/accel.sh@22 -- # case "$var" in 00:08:26.679 12:02:39 -- accel/accel.sh@20 -- # IFS=: 00:08:26.679 12:02:39 -- accel/accel.sh@20 -- # read -r var val 00:08:26.679 12:02:39 -- accel/accel.sh@21 -- # val=32 00:08:26.679 12:02:39 -- accel/accel.sh@22 -- # case "$var" in 00:08:26.679 12:02:39 -- accel/accel.sh@20 -- # IFS=: 00:08:26.679 12:02:39 -- accel/accel.sh@20 -- # read -r var val 00:08:26.679 12:02:39 -- accel/accel.sh@21 -- # val=1 00:08:26.679 12:02:39 -- accel/accel.sh@22 -- # case "$var" in 00:08:26.679 12:02:39 -- accel/accel.sh@20 -- # IFS=: 00:08:26.679 12:02:39 -- accel/accel.sh@20 -- # read -r var val 00:08:26.679 12:02:39 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:08:26.679 12:02:39 -- accel/accel.sh@22 -- # case "$var" in 00:08:26.679 12:02:39 -- accel/accel.sh@20 -- # IFS=: 00:08:26.679 12:02:39 -- accel/accel.sh@20 -- # read -r var val 00:08:26.679 12:02:39 -- accel/accel.sh@21 -- # val=Yes 00:08:26.679 12:02:39 -- accel/accel.sh@22 -- # case "$var" in 00:08:26.679 12:02:39 -- accel/accel.sh@20 -- # IFS=: 00:08:26.679 12:02:39 -- accel/accel.sh@20 -- # read -r var val 00:08:26.679 12:02:39 -- accel/accel.sh@21 -- # val= 00:08:26.679 12:02:39 -- accel/accel.sh@22 -- # case "$var" in 00:08:26.679 12:02:39 -- accel/accel.sh@20 -- # IFS=: 00:08:26.679 12:02:39 -- accel/accel.sh@20 -- # read -r var val 00:08:26.679 12:02:39 -- accel/accel.sh@21 -- # val= 00:08:26.679 12:02:39 -- accel/accel.sh@22 -- # case "$var" in 00:08:26.679 12:02:39 -- accel/accel.sh@20 -- # IFS=: 00:08:26.679 12:02:39 -- accel/accel.sh@20 -- # read -r var val 00:08:28.059 12:02:40 -- accel/accel.sh@21 -- # val= 00:08:28.059 12:02:40 -- accel/accel.sh@22 -- # case "$var" in 00:08:28.059 12:02:40 -- accel/accel.sh@20 -- # IFS=: 00:08:28.059 12:02:40 -- accel/accel.sh@20 -- # read -r var val 00:08:28.059 12:02:40 -- accel/accel.sh@21 -- # val= 00:08:28.059 12:02:40 -- accel/accel.sh@22 -- # case "$var" in 00:08:28.059 12:02:40 -- accel/accel.sh@20 -- # IFS=: 00:08:28.059 12:02:40 -- accel/accel.sh@20 -- # read -r var val 00:08:28.059 12:02:40 -- accel/accel.sh@21 -- # val= 00:08:28.059 12:02:40 -- accel/accel.sh@22 -- # case "$var" in 00:08:28.059 12:02:40 -- accel/accel.sh@20 -- # IFS=: 00:08:28.059 12:02:40 -- accel/accel.sh@20 -- # read -r var val 00:08:28.059 12:02:40 -- accel/accel.sh@21 -- # val= 00:08:28.059 12:02:40 -- accel/accel.sh@22 -- # case "$var" in 00:08:28.059 12:02:40 -- accel/accel.sh@20 -- # IFS=: 00:08:28.059 12:02:40 -- accel/accel.sh@20 -- # read -r var val 00:08:28.059 12:02:40 -- accel/accel.sh@21 -- # val= 00:08:28.059 12:02:40 -- accel/accel.sh@22 -- # case "$var" in 00:08:28.059 12:02:40 -- accel/accel.sh@20 -- # IFS=: 00:08:28.059 12:02:40 -- accel/accel.sh@20 -- # read -r var val 00:08:28.059 12:02:40 -- accel/accel.sh@21 -- # val= 00:08:28.059 12:02:40 -- accel/accel.sh@22 -- # case "$var" in 00:08:28.059 12:02:40 -- accel/accel.sh@20 -- # IFS=: 00:08:28.059 12:02:40 -- accel/accel.sh@20 -- # read -r var val 00:08:28.059 12:02:40 -- accel/accel.sh@28 -- # [[ -n software ]] 00:08:28.059 12:02:40 -- accel/accel.sh@28 -- # [[ -n copy ]] 00:08:28.059 12:02:40 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:28.059 00:08:28.059 real 0m2.777s 00:08:28.059 user 0m2.403s 00:08:28.059 sys 0m0.375s 00:08:28.059 12:02:40 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:28.059 12:02:40 -- common/autotest_common.sh@10 -- # set +x 00:08:28.059 ************************************ 00:08:28.059 END TEST accel_copy 00:08:28.059 ************************************ 00:08:28.059 12:02:40 -- accel/accel.sh@96 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:08:28.059 12:02:40 -- common/autotest_common.sh@1077 -- # '[' 13 -le 1 ']' 00:08:28.059 12:02:40 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:28.059 12:02:40 -- common/autotest_common.sh@10 -- # set +x 00:08:28.059 ************************************ 00:08:28.059 START TEST accel_fill 00:08:28.059 ************************************ 00:08:28.059 12:02:40 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:08:28.059 12:02:40 -- accel/accel.sh@16 -- # local accel_opc 
00:08:28.059 12:02:40 -- accel/accel.sh@17 -- # local accel_module 00:08:28.059 12:02:40 -- accel/accel.sh@18 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:08:28.059 12:02:40 -- accel/accel.sh@12 -- # build_accel_config 00:08:28.059 12:02:40 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:28.059 12:02:40 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:08:28.059 12:02:40 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:28.059 12:02:40 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:28.059 12:02:40 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:28.059 12:02:40 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:28.059 12:02:40 -- accel/accel.sh@41 -- # local IFS=, 00:08:28.059 12:02:40 -- accel/accel.sh@42 -- # jq -r . 00:08:28.059 [2024-06-11 12:02:40.812792] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:08:28.059 [2024-06-11 12:02:40.812876] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2682976 ] 00:08:28.059 EAL: No free 2048 kB hugepages reported on node 1 00:08:28.059 [2024-06-11 12:02:40.930959] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:28.059 [2024-06-11 12:02:40.978571] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:29.436 12:02:42 -- accel/accel.sh@18 -- # out=' 00:08:29.436 SPDK Configuration: 00:08:29.436 Core mask: 0x1 00:08:29.436 00:08:29.436 Accel Perf Configuration: 00:08:29.436 Workload Type: fill 00:08:29.436 Fill pattern: 0x80 00:08:29.436 Transfer size: 4096 bytes 00:08:29.436 Vector count 1 00:08:29.436 Module: software 00:08:29.436 Queue depth: 64 00:08:29.436 Allocate depth: 64 00:08:29.436 # threads/core: 1 00:08:29.436 Run time: 1 seconds 00:08:29.436 Verify: Yes 00:08:29.436 00:08:29.436 Running for 1 seconds... 00:08:29.436 00:08:29.436 Core,Thread Transfers Bandwidth Failed Miscompares 00:08:29.436 ------------------------------------------------------------------------------------ 00:08:29.436 0,0 596160/s 2328 MiB/s 0 0 00:08:29.436 ==================================================================================== 00:08:29.436 Total 596160/s 2328 MiB/s 0 0' 00:08:29.436 12:02:42 -- accel/accel.sh@20 -- # IFS=: 00:08:29.436 12:02:42 -- accel/accel.sh@20 -- # read -r var val 00:08:29.436 12:02:42 -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:08:29.436 12:02:42 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:08:29.436 12:02:42 -- accel/accel.sh@12 -- # build_accel_config 00:08:29.436 12:02:42 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:29.436 12:02:42 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:29.436 12:02:42 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:29.436 12:02:42 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:29.436 12:02:42 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:29.436 12:02:42 -- accel/accel.sh@41 -- # local IFS=, 00:08:29.436 12:02:42 -- accel/accel.sh@42 -- # jq -r . 00:08:29.436 [2024-06-11 12:02:42.197648] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 
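Note: the fill banner above decodes its flags directly: -f 128 is the fill byte (0x80 in the banner), and -q 64 with -a 64 raises queue and allocate depth to 64 where the earlier workloads used 32. Fill only writes its buffers, which fits it posting the highest transfer rate in this stream (596160 transfers/s, i.e. 2328 MiB/s at 4 KiB). As invoked:

    # Fill each 4 KiB buffer with byte 0x80 at queue depth 64, then verify:
    /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf \
        -t 1 -w fill -f 128 -q 64 -a 64 -y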
00:08:29.436 [2024-06-11 12:02:42.197744] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2683156 ] 00:08:29.436 EAL: No free 2048 kB hugepages reported on node 1 00:08:29.436 [2024-06-11 12:02:42.316196] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:29.436 [2024-06-11 12:02:42.363486] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:29.436 12:02:42 -- accel/accel.sh@21 -- # val= 00:08:29.436 12:02:42 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.436 12:02:42 -- accel/accel.sh@20 -- # IFS=: 00:08:29.436 12:02:42 -- accel/accel.sh@20 -- # read -r var val 00:08:29.436 12:02:42 -- accel/accel.sh@21 -- # val= 00:08:29.436 12:02:42 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.436 12:02:42 -- accel/accel.sh@20 -- # IFS=: 00:08:29.436 12:02:42 -- accel/accel.sh@20 -- # read -r var val 00:08:29.436 12:02:42 -- accel/accel.sh@21 -- # val=0x1 00:08:29.436 12:02:42 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.436 12:02:42 -- accel/accel.sh@20 -- # IFS=: 00:08:29.436 12:02:42 -- accel/accel.sh@20 -- # read -r var val 00:08:29.436 12:02:42 -- accel/accel.sh@21 -- # val= 00:08:29.436 12:02:42 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.436 12:02:42 -- accel/accel.sh@20 -- # IFS=: 00:08:29.436 12:02:42 -- accel/accel.sh@20 -- # read -r var val 00:08:29.436 12:02:42 -- accel/accel.sh@21 -- # val= 00:08:29.436 12:02:42 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.436 12:02:42 -- accel/accel.sh@20 -- # IFS=: 00:08:29.436 12:02:42 -- accel/accel.sh@20 -- # read -r var val 00:08:29.436 12:02:42 -- accel/accel.sh@21 -- # val=fill 00:08:29.436 12:02:42 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.436 12:02:42 -- accel/accel.sh@24 -- # accel_opc=fill 00:08:29.436 12:02:42 -- accel/accel.sh@20 -- # IFS=: 00:08:29.436 12:02:42 -- accel/accel.sh@20 -- # read -r var val 00:08:29.436 12:02:42 -- accel/accel.sh@21 -- # val=0x80 00:08:29.436 12:02:42 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.436 12:02:42 -- accel/accel.sh@20 -- # IFS=: 00:08:29.436 12:02:42 -- accel/accel.sh@20 -- # read -r var val 00:08:29.436 12:02:42 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:29.436 12:02:42 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.436 12:02:42 -- accel/accel.sh@20 -- # IFS=: 00:08:29.436 12:02:42 -- accel/accel.sh@20 -- # read -r var val 00:08:29.436 12:02:42 -- accel/accel.sh@21 -- # val= 00:08:29.436 12:02:42 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.436 12:02:42 -- accel/accel.sh@20 -- # IFS=: 00:08:29.436 12:02:42 -- accel/accel.sh@20 -- # read -r var val 00:08:29.436 12:02:42 -- accel/accel.sh@21 -- # val=software 00:08:29.436 12:02:42 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.436 12:02:42 -- accel/accel.sh@23 -- # accel_module=software 00:08:29.436 12:02:42 -- accel/accel.sh@20 -- # IFS=: 00:08:29.436 12:02:42 -- accel/accel.sh@20 -- # read -r var val 00:08:29.436 12:02:42 -- accel/accel.sh@21 -- # val=64 00:08:29.436 12:02:42 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.436 12:02:42 -- accel/accel.sh@20 -- # IFS=: 00:08:29.436 12:02:42 -- accel/accel.sh@20 -- # read -r var val 00:08:29.436 12:02:42 -- accel/accel.sh@21 -- # val=64 00:08:29.436 12:02:42 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.436 12:02:42 -- accel/accel.sh@20 -- # IFS=: 00:08:29.436 12:02:42 -- accel/accel.sh@20 -- # read -r var val 00:08:29.436 12:02:42 -- 
accel/accel.sh@21 -- # val=1 00:08:29.436 12:02:42 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.436 12:02:42 -- accel/accel.sh@20 -- # IFS=: 00:08:29.436 12:02:42 -- accel/accel.sh@20 -- # read -r var val 00:08:29.436 12:02:42 -- accel/accel.sh@21 -- # val='1 seconds' 00:08:29.436 12:02:42 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.436 12:02:42 -- accel/accel.sh@20 -- # IFS=: 00:08:29.436 12:02:42 -- accel/accel.sh@20 -- # read -r var val 00:08:29.436 12:02:42 -- accel/accel.sh@21 -- # val=Yes 00:08:29.436 12:02:42 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.436 12:02:42 -- accel/accel.sh@20 -- # IFS=: 00:08:29.436 12:02:42 -- accel/accel.sh@20 -- # read -r var val 00:08:29.436 12:02:42 -- accel/accel.sh@21 -- # val= 00:08:29.436 12:02:42 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.436 12:02:42 -- accel/accel.sh@20 -- # IFS=: 00:08:29.436 12:02:42 -- accel/accel.sh@20 -- # read -r var val 00:08:29.436 12:02:42 -- accel/accel.sh@21 -- # val= 00:08:29.436 12:02:42 -- accel/accel.sh@22 -- # case "$var" in 00:08:29.436 12:02:42 -- accel/accel.sh@20 -- # IFS=: 00:08:29.436 12:02:42 -- accel/accel.sh@20 -- # read -r var val 00:08:30.811 12:02:43 -- accel/accel.sh@21 -- # val= 00:08:30.811 12:02:43 -- accel/accel.sh@22 -- # case "$var" in 00:08:30.811 12:02:43 -- accel/accel.sh@20 -- # IFS=: 00:08:30.811 12:02:43 -- accel/accel.sh@20 -- # read -r var val 00:08:30.811 12:02:43 -- accel/accel.sh@21 -- # val= 00:08:30.811 12:02:43 -- accel/accel.sh@22 -- # case "$var" in 00:08:30.811 12:02:43 -- accel/accel.sh@20 -- # IFS=: 00:08:30.811 12:02:43 -- accel/accel.sh@20 -- # read -r var val 00:08:30.811 12:02:43 -- accel/accel.sh@21 -- # val= 00:08:30.811 12:02:43 -- accel/accel.sh@22 -- # case "$var" in 00:08:30.811 12:02:43 -- accel/accel.sh@20 -- # IFS=: 00:08:30.811 12:02:43 -- accel/accel.sh@20 -- # read -r var val 00:08:30.811 12:02:43 -- accel/accel.sh@21 -- # val= 00:08:30.811 12:02:43 -- accel/accel.sh@22 -- # case "$var" in 00:08:30.811 12:02:43 -- accel/accel.sh@20 -- # IFS=: 00:08:30.811 12:02:43 -- accel/accel.sh@20 -- # read -r var val 00:08:30.811 12:02:43 -- accel/accel.sh@21 -- # val= 00:08:30.811 12:02:43 -- accel/accel.sh@22 -- # case "$var" in 00:08:30.811 12:02:43 -- accel/accel.sh@20 -- # IFS=: 00:08:30.811 12:02:43 -- accel/accel.sh@20 -- # read -r var val 00:08:30.811 12:02:43 -- accel/accel.sh@21 -- # val= 00:08:30.811 12:02:43 -- accel/accel.sh@22 -- # case "$var" in 00:08:30.811 12:02:43 -- accel/accel.sh@20 -- # IFS=: 00:08:30.811 12:02:43 -- accel/accel.sh@20 -- # read -r var val 00:08:30.811 12:02:43 -- accel/accel.sh@28 -- # [[ -n software ]] 00:08:30.811 12:02:43 -- accel/accel.sh@28 -- # [[ -n fill ]] 00:08:30.811 12:02:43 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:30.811 00:08:30.811 real 0m2.776s 00:08:30.811 user 0m2.408s 00:08:30.811 sys 0m0.371s 00:08:30.811 12:02:43 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:30.811 12:02:43 -- common/autotest_common.sh@10 -- # set +x 00:08:30.811 ************************************ 00:08:30.811 END TEST accel_fill 00:08:30.811 ************************************ 00:08:30.811 12:02:43 -- accel/accel.sh@97 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:08:30.811 12:02:43 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:08:30.811 12:02:43 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:30.811 12:02:43 -- common/autotest_common.sh@10 -- # set +x 00:08:30.811 ************************************ 00:08:30.811 START TEST 
accel_copy_crc32c 00:08:30.811 ************************************ 00:08:30.811 12:02:43 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w copy_crc32c -y 00:08:30.811 12:02:43 -- accel/accel.sh@16 -- # local accel_opc 00:08:30.811 12:02:43 -- accel/accel.sh@17 -- # local accel_module 00:08:30.811 12:02:43 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y 00:08:30.811 12:02:43 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:08:30.811 12:02:43 -- accel/accel.sh@12 -- # build_accel_config 00:08:30.811 12:02:43 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:30.811 12:02:43 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:30.811 12:02:43 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:30.811 12:02:43 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:30.811 12:02:43 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:30.811 12:02:43 -- accel/accel.sh@41 -- # local IFS=, 00:08:30.811 12:02:43 -- accel/accel.sh@42 -- # jq -r . 00:08:30.811 [2024-06-11 12:02:43.629609] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:08:30.811 [2024-06-11 12:02:43.629688] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2683358 ] 00:08:30.811 EAL: No free 2048 kB hugepages reported on node 1 00:08:30.811 [2024-06-11 12:02:43.748599] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:30.811 [2024-06-11 12:02:43.796081] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:32.188 12:02:44 -- accel/accel.sh@18 -- # out=' 00:08:32.188 SPDK Configuration: 00:08:32.188 Core mask: 0x1 00:08:32.188 00:08:32.188 Accel Perf Configuration: 00:08:32.188 Workload Type: copy_crc32c 00:08:32.188 CRC-32C seed: 0 00:08:32.188 Vector size: 4096 bytes 00:08:32.188 Transfer size: 4096 bytes 00:08:32.188 Vector count 1 00:08:32.188 Module: software 00:08:32.188 Queue depth: 32 00:08:32.188 Allocate depth: 32 00:08:32.188 # threads/core: 1 00:08:32.188 Run time: 1 seconds 00:08:32.188 Verify: Yes 00:08:32.188 00:08:32.188 Running for 1 seconds... 00:08:32.188 00:08:32.188 Core,Thread Transfers Bandwidth Failed Miscompares 00:08:32.188 ------------------------------------------------------------------------------------ 00:08:32.188 0,0 268160/s 1047 MiB/s 0 0 00:08:32.188 ==================================================================================== 00:08:32.188 Total 268160/s 1047 MiB/s 0 0' 00:08:32.188 12:02:44 -- accel/accel.sh@20 -- # IFS=: 00:08:32.188 12:02:44 -- accel/accel.sh@20 -- # read -r var val 00:08:32.188 12:02:44 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:08:32.188 12:02:44 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:08:32.188 12:02:44 -- accel/accel.sh@12 -- # build_accel_config 00:08:32.188 12:02:44 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:32.188 12:02:44 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:32.188 12:02:44 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:32.188 12:02:44 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:32.188 12:02:44 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:32.188 12:02:44 -- accel/accel.sh@41 -- # local IFS=, 00:08:32.188 12:02:44 -- accel/accel.sh@42 -- # jq -r . 
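The build_accel_config / jq -r . pair in the trace above is the harness assembling its accel JSON config: accel_json_cfg=() stays empty in these runs (every [[ 0 -gt 0 ]] guard, apparently gating optional hardware modules, is false, so the software module ends up selected), and the result reaches accel_perf as a file-descriptor path rather than a file on disk. A rough sketch of the same pattern, where the empty-object config is a placeholder and not the harness's real payload:

    exec 62< <(jq -r . <<< '{}')   # placeholder config; the harness builds this from accel_json_cfg
    ./build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y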
00:08:32.188 [2024-06-11 12:02:45.013574] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:08:32.188 [2024-06-11 12:02:45.013667] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2683536 ] 00:08:32.188 EAL: No free 2048 kB hugepages reported on node 1 00:08:32.188 [2024-06-11 12:02:45.130581] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:32.188 [2024-06-11 12:02:45.177787] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:32.447 12:02:45 -- accel/accel.sh@21 -- # val= 00:08:32.447 12:02:45 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.447 12:02:45 -- accel/accel.sh@20 -- # IFS=: 00:08:32.447 12:02:45 -- accel/accel.sh@20 -- # read -r var val 00:08:32.447 12:02:45 -- accel/accel.sh@21 -- # val= 00:08:32.447 12:02:45 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.447 12:02:45 -- accel/accel.sh@20 -- # IFS=: 00:08:32.447 12:02:45 -- accel/accel.sh@20 -- # read -r var val 00:08:32.447 12:02:45 -- accel/accel.sh@21 -- # val=0x1 00:08:32.447 12:02:45 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.447 12:02:45 -- accel/accel.sh@20 -- # IFS=: 00:08:32.447 12:02:45 -- accel/accel.sh@20 -- # read -r var val 00:08:32.447 12:02:45 -- accel/accel.sh@21 -- # val= 00:08:32.447 12:02:45 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.447 12:02:45 -- accel/accel.sh@20 -- # IFS=: 00:08:32.447 12:02:45 -- accel/accel.sh@20 -- # read -r var val 00:08:32.447 12:02:45 -- accel/accel.sh@21 -- # val= 00:08:32.447 12:02:45 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.447 12:02:45 -- accel/accel.sh@20 -- # IFS=: 00:08:32.447 12:02:45 -- accel/accel.sh@20 -- # read -r var val 00:08:32.447 12:02:45 -- accel/accel.sh@21 -- # val=copy_crc32c 00:08:32.447 12:02:45 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.447 12:02:45 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:08:32.447 12:02:45 -- accel/accel.sh@20 -- # IFS=: 00:08:32.447 12:02:45 -- accel/accel.sh@20 -- # read -r var val 00:08:32.447 12:02:45 -- accel/accel.sh@21 -- # val=0 00:08:32.447 12:02:45 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.447 12:02:45 -- accel/accel.sh@20 -- # IFS=: 00:08:32.447 12:02:45 -- accel/accel.sh@20 -- # read -r var val 00:08:32.447 12:02:45 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:32.447 12:02:45 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.447 12:02:45 -- accel/accel.sh@20 -- # IFS=: 00:08:32.447 12:02:45 -- accel/accel.sh@20 -- # read -r var val 00:08:32.447 12:02:45 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:32.447 12:02:45 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.447 12:02:45 -- accel/accel.sh@20 -- # IFS=: 00:08:32.447 12:02:45 -- accel/accel.sh@20 -- # read -r var val 00:08:32.447 12:02:45 -- accel/accel.sh@21 -- # val= 00:08:32.447 12:02:45 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.447 12:02:45 -- accel/accel.sh@20 -- # IFS=: 00:08:32.447 12:02:45 -- accel/accel.sh@20 -- # read -r var val 00:08:32.447 12:02:45 -- accel/accel.sh@21 -- # val=software 00:08:32.447 12:02:45 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.447 12:02:45 -- accel/accel.sh@23 -- # accel_module=software 00:08:32.447 12:02:45 -- accel/accel.sh@20 -- # IFS=: 00:08:32.447 12:02:45 -- accel/accel.sh@20 -- # read -r var val 00:08:32.447 12:02:45 -- accel/accel.sh@21 -- # val=32 00:08:32.447 12:02:45 -- accel/accel.sh@22 -- # case "$var" in 
00:08:32.447 12:02:45 -- accel/accel.sh@20 -- # IFS=: 00:08:32.447 12:02:45 -- accel/accel.sh@20 -- # read -r var val 00:08:32.447 12:02:45 -- accel/accel.sh@21 -- # val=32 00:08:32.447 12:02:45 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.447 12:02:45 -- accel/accel.sh@20 -- # IFS=: 00:08:32.447 12:02:45 -- accel/accel.sh@20 -- # read -r var val 00:08:32.447 12:02:45 -- accel/accel.sh@21 -- # val=1 00:08:32.447 12:02:45 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.447 12:02:45 -- accel/accel.sh@20 -- # IFS=: 00:08:32.447 12:02:45 -- accel/accel.sh@20 -- # read -r var val 00:08:32.447 12:02:45 -- accel/accel.sh@21 -- # val='1 seconds' 00:08:32.447 12:02:45 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.447 12:02:45 -- accel/accel.sh@20 -- # IFS=: 00:08:32.447 12:02:45 -- accel/accel.sh@20 -- # read -r var val 00:08:32.447 12:02:45 -- accel/accel.sh@21 -- # val=Yes 00:08:32.447 12:02:45 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.447 12:02:45 -- accel/accel.sh@20 -- # IFS=: 00:08:32.447 12:02:45 -- accel/accel.sh@20 -- # read -r var val 00:08:32.447 12:02:45 -- accel/accel.sh@21 -- # val= 00:08:32.447 12:02:45 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.447 12:02:45 -- accel/accel.sh@20 -- # IFS=: 00:08:32.447 12:02:45 -- accel/accel.sh@20 -- # read -r var val 00:08:32.447 12:02:45 -- accel/accel.sh@21 -- # val= 00:08:32.447 12:02:45 -- accel/accel.sh@22 -- # case "$var" in 00:08:32.447 12:02:45 -- accel/accel.sh@20 -- # IFS=: 00:08:32.447 12:02:45 -- accel/accel.sh@20 -- # read -r var val 00:08:33.384 12:02:46 -- accel/accel.sh@21 -- # val= 00:08:33.384 12:02:46 -- accel/accel.sh@22 -- # case "$var" in 00:08:33.384 12:02:46 -- accel/accel.sh@20 -- # IFS=: 00:08:33.384 12:02:46 -- accel/accel.sh@20 -- # read -r var val 00:08:33.384 12:02:46 -- accel/accel.sh@21 -- # val= 00:08:33.384 12:02:46 -- accel/accel.sh@22 -- # case "$var" in 00:08:33.384 12:02:46 -- accel/accel.sh@20 -- # IFS=: 00:08:33.384 12:02:46 -- accel/accel.sh@20 -- # read -r var val 00:08:33.384 12:02:46 -- accel/accel.sh@21 -- # val= 00:08:33.384 12:02:46 -- accel/accel.sh@22 -- # case "$var" in 00:08:33.384 12:02:46 -- accel/accel.sh@20 -- # IFS=: 00:08:33.384 12:02:46 -- accel/accel.sh@20 -- # read -r var val 00:08:33.384 12:02:46 -- accel/accel.sh@21 -- # val= 00:08:33.384 12:02:46 -- accel/accel.sh@22 -- # case "$var" in 00:08:33.384 12:02:46 -- accel/accel.sh@20 -- # IFS=: 00:08:33.384 12:02:46 -- accel/accel.sh@20 -- # read -r var val 00:08:33.384 12:02:46 -- accel/accel.sh@21 -- # val= 00:08:33.384 12:02:46 -- accel/accel.sh@22 -- # case "$var" in 00:08:33.384 12:02:46 -- accel/accel.sh@20 -- # IFS=: 00:08:33.384 12:02:46 -- accel/accel.sh@20 -- # read -r var val 00:08:33.384 12:02:46 -- accel/accel.sh@21 -- # val= 00:08:33.384 12:02:46 -- accel/accel.sh@22 -- # case "$var" in 00:08:33.384 12:02:46 -- accel/accel.sh@20 -- # IFS=: 00:08:33.384 12:02:46 -- accel/accel.sh@20 -- # read -r var val 00:08:33.384 12:02:46 -- accel/accel.sh@28 -- # [[ -n software ]] 00:08:33.384 12:02:46 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:08:33.384 12:02:46 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:33.384 00:08:33.384 real 0m2.766s 00:08:33.384 user 0m2.402s 00:08:33.384 sys 0m0.368s 00:08:33.384 12:02:46 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:33.384 12:02:46 -- common/autotest_common.sh@10 -- # set +x 00:08:33.384 ************************************ 00:08:33.384 END TEST accel_copy_crc32c 00:08:33.384 ************************************ 00:08:33.643 
12:02:46 -- accel/accel.sh@98 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:08:33.643 12:02:46 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:08:33.643 12:02:46 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:33.643 12:02:46 -- common/autotest_common.sh@10 -- # set +x 00:08:33.643 ************************************ 00:08:33.643 START TEST accel_copy_crc32c_C2 00:08:33.643 ************************************ 00:08:33.643 12:02:46 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:08:33.643 12:02:46 -- accel/accel.sh@16 -- # local accel_opc 00:08:33.643 12:02:46 -- accel/accel.sh@17 -- # local accel_module 00:08:33.643 12:02:46 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:08:33.643 12:02:46 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:08:33.643 12:02:46 -- accel/accel.sh@12 -- # build_accel_config 00:08:33.643 12:02:46 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:33.643 12:02:46 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:33.644 12:02:46 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:33.644 12:02:46 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:33.644 12:02:46 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:33.644 12:02:46 -- accel/accel.sh@41 -- # local IFS=, 00:08:33.644 12:02:46 -- accel/accel.sh@42 -- # jq -r . 00:08:33.644 [2024-06-11 12:02:46.453003] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:08:33.644 [2024-06-11 12:02:46.453108] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2683731 ] 00:08:33.644 EAL: No free 2048 kB hugepages reported on node 1 00:08:33.644 [2024-06-11 12:02:46.576090] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:33.644 [2024-06-11 12:02:46.625675] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:35.020 12:02:47 -- accel/accel.sh@18 -- # out=' 00:08:35.020 SPDK Configuration: 00:08:35.020 Core mask: 0x1 00:08:35.020 00:08:35.020 Accel Perf Configuration: 00:08:35.020 Workload Type: copy_crc32c 00:08:35.020 CRC-32C seed: 0 00:08:35.020 Vector size: 4096 bytes 00:08:35.020 Transfer size: 8192 bytes 00:08:35.020 Vector count 2 00:08:35.020 Module: software 00:08:35.020 Queue depth: 32 00:08:35.020 Allocate depth: 32 00:08:35.020 # threads/core: 1 00:08:35.020 Run time: 1 seconds 00:08:35.020 Verify: Yes 00:08:35.020 00:08:35.020 Running for 1 seconds... 
00:08:35.020 00:08:35.020 Core,Thread Transfers Bandwidth Failed Miscompares 00:08:35.020 ------------------------------------------------------------------------------------ 00:08:35.020 0,0 188032/s 1469 MiB/s 0 0 00:08:35.020 ==================================================================================== 00:08:35.020 Total 188032/s 1469 MiB/s 0 0' 00:08:35.020 12:02:47 -- accel/accel.sh@20 -- # IFS=: 00:08:35.020 12:02:47 -- accel/accel.sh@20 -- # read -r var val 00:08:35.020 12:02:47 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:08:35.020 12:02:47 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:08:35.020 12:02:47 -- accel/accel.sh@12 -- # build_accel_config 00:08:35.020 12:02:47 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:35.020 12:02:47 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:35.020 12:02:47 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:35.020 12:02:47 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:35.020 12:02:47 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:35.020 12:02:47 -- accel/accel.sh@41 -- # local IFS=, 00:08:35.020 12:02:47 -- accel/accel.sh@42 -- # jq -r . 00:08:35.020 [2024-06-11 12:02:47.840159] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:08:35.020 [2024-06-11 12:02:47.840247] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2683919 ] 00:08:35.020 EAL: No free 2048 kB hugepages reported on node 1 00:08:35.020 [2024-06-11 12:02:47.955453] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:35.020 [2024-06-11 12:02:47.999503] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:35.020 12:02:48 -- accel/accel.sh@21 -- # val= 00:08:35.020 12:02:48 -- accel/accel.sh@22 -- # case "$var" in 00:08:35.020 12:02:48 -- accel/accel.sh@20 -- # IFS=: 00:08:35.020 12:02:48 -- accel/accel.sh@20 -- # read -r var val 00:08:35.020 12:02:48 -- accel/accel.sh@21 -- # val= 00:08:35.020 12:02:48 -- accel/accel.sh@22 -- # case "$var" in 00:08:35.020 12:02:48 -- accel/accel.sh@20 -- # IFS=: 00:08:35.020 12:02:48 -- accel/accel.sh@20 -- # read -r var val 00:08:35.020 12:02:48 -- accel/accel.sh@21 -- # val=0x1 00:08:35.020 12:02:48 -- accel/accel.sh@22 -- # case "$var" in 00:08:35.020 12:02:48 -- accel/accel.sh@20 -- # IFS=: 00:08:35.020 12:02:48 -- accel/accel.sh@20 -- # read -r var val 00:08:35.020 12:02:48 -- accel/accel.sh@21 -- # val= 00:08:35.020 12:02:48 -- accel/accel.sh@22 -- # case "$var" in 00:08:35.020 12:02:48 -- accel/accel.sh@20 -- # IFS=: 00:08:35.020 12:02:48 -- accel/accel.sh@20 -- # read -r var val 00:08:35.020 12:02:48 -- accel/accel.sh@21 -- # val= 00:08:35.020 12:02:48 -- accel/accel.sh@22 -- # case "$var" in 00:08:35.020 12:02:48 -- accel/accel.sh@20 -- # IFS=: 00:08:35.020 12:02:48 -- accel/accel.sh@20 -- # read -r var val 00:08:35.020 12:02:48 -- accel/accel.sh@21 -- # val=copy_crc32c 00:08:35.020 12:02:48 -- accel/accel.sh@22 -- # case "$var" in 00:08:35.020 12:02:48 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:08:35.020 12:02:48 -- accel/accel.sh@20 -- # IFS=: 00:08:35.020 12:02:48 -- accel/accel.sh@20 -- # read -r var val 00:08:35.020 12:02:48 -- accel/accel.sh@21 -- # val=0 00:08:35.020 12:02:48 -- accel/accel.sh@22 -- # case "$var" in 00:08:35.020 12:02:48 -- accel/accel.sh@20 -- # IFS=:
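A quick arithmetic check on the copy_crc32c -C 2 table above: bandwidth is just transfers per second times the transfer size, and with two 4096-byte source vectors the transfer size doubles to the reported 8192 bytes, so the single core's row and the Total row must agree:

    echo $(( 188032 * 8192 / 1024 / 1024 ))   # 1469 (MiB/s), matching both rows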
00:08:35.020 12:02:48 -- accel/accel.sh@20 -- # read -r var val 00:08:35.020 12:02:48 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:35.020 12:02:48 -- accel/accel.sh@22 -- # case "$var" in 00:08:35.020 12:02:48 -- accel/accel.sh@20 -- # IFS=: 00:08:35.020 12:02:48 -- accel/accel.sh@20 -- # read -r var val 00:08:35.020 12:02:48 -- accel/accel.sh@21 -- # val='8192 bytes' 00:08:35.020 12:02:48 -- accel/accel.sh@22 -- # case "$var" in 00:08:35.020 12:02:48 -- accel/accel.sh@20 -- # IFS=: 00:08:35.020 12:02:48 -- accel/accel.sh@20 -- # read -r var val 00:08:35.021 12:02:48 -- accel/accel.sh@21 -- # val= 00:08:35.021 12:02:48 -- accel/accel.sh@22 -- # case "$var" in 00:08:35.021 12:02:48 -- accel/accel.sh@20 -- # IFS=: 00:08:35.279 12:02:48 -- accel/accel.sh@20 -- # read -r var val 00:08:35.279 12:02:48 -- accel/accel.sh@21 -- # val=software 00:08:35.279 12:02:48 -- accel/accel.sh@22 -- # case "$var" in 00:08:35.279 12:02:48 -- accel/accel.sh@23 -- # accel_module=software 00:08:35.279 12:02:48 -- accel/accel.sh@20 -- # IFS=: 00:08:35.279 12:02:48 -- accel/accel.sh@20 -- # read -r var val 00:08:35.279 12:02:48 -- accel/accel.sh@21 -- # val=32 00:08:35.279 12:02:48 -- accel/accel.sh@22 -- # case "$var" in 00:08:35.279 12:02:48 -- accel/accel.sh@20 -- # IFS=: 00:08:35.279 12:02:48 -- accel/accel.sh@20 -- # read -r var val 00:08:35.279 12:02:48 -- accel/accel.sh@21 -- # val=32 00:08:35.279 12:02:48 -- accel/accel.sh@22 -- # case "$var" in 00:08:35.279 12:02:48 -- accel/accel.sh@20 -- # IFS=: 00:08:35.279 12:02:48 -- accel/accel.sh@20 -- # read -r var val 00:08:35.279 12:02:48 -- accel/accel.sh@21 -- # val=1 00:08:35.279 12:02:48 -- accel/accel.sh@22 -- # case "$var" in 00:08:35.279 12:02:48 -- accel/accel.sh@20 -- # IFS=: 00:08:35.279 12:02:48 -- accel/accel.sh@20 -- # read -r var val 00:08:35.279 12:02:48 -- accel/accel.sh@21 -- # val='1 seconds' 00:08:35.279 12:02:48 -- accel/accel.sh@22 -- # case "$var" in 00:08:35.279 12:02:48 -- accel/accel.sh@20 -- # IFS=: 00:08:35.279 12:02:48 -- accel/accel.sh@20 -- # read -r var val 00:08:35.279 12:02:48 -- accel/accel.sh@21 -- # val=Yes 00:08:35.279 12:02:48 -- accel/accel.sh@22 -- # case "$var" in 00:08:35.279 12:02:48 -- accel/accel.sh@20 -- # IFS=: 00:08:35.279 12:02:48 -- accel/accel.sh@20 -- # read -r var val 00:08:35.279 12:02:48 -- accel/accel.sh@21 -- # val= 00:08:35.279 12:02:48 -- accel/accel.sh@22 -- # case "$var" in 00:08:35.279 12:02:48 -- accel/accel.sh@20 -- # IFS=: 00:08:35.279 12:02:48 -- accel/accel.sh@20 -- # read -r var val 00:08:35.279 12:02:48 -- accel/accel.sh@21 -- # val= 00:08:35.279 12:02:48 -- accel/accel.sh@22 -- # case "$var" in 00:08:35.279 12:02:48 -- accel/accel.sh@20 -- # IFS=: 00:08:35.279 12:02:48 -- accel/accel.sh@20 -- # read -r var val 00:08:36.216 12:02:49 -- accel/accel.sh@21 -- # val= 00:08:36.216 12:02:49 -- accel/accel.sh@22 -- # case "$var" in 00:08:36.216 12:02:49 -- accel/accel.sh@20 -- # IFS=: 00:08:36.216 12:02:49 -- accel/accel.sh@20 -- # read -r var val 00:08:36.216 12:02:49 -- accel/accel.sh@21 -- # val= 00:08:36.216 12:02:49 -- accel/accel.sh@22 -- # case "$var" in 00:08:36.216 12:02:49 -- accel/accel.sh@20 -- # IFS=: 00:08:36.216 12:02:49 -- accel/accel.sh@20 -- # read -r var val 00:08:36.216 12:02:49 -- accel/accel.sh@21 -- # val= 00:08:36.216 12:02:49 -- accel/accel.sh@22 -- # case "$var" in 00:08:36.216 12:02:49 -- accel/accel.sh@20 -- # IFS=: 00:08:36.216 12:02:49 -- accel/accel.sh@20 -- # read -r var val 00:08:36.216 12:02:49 -- accel/accel.sh@21 -- # val= 00:08:36.217 12:02:49 -- 
accel/accel.sh@22 -- # case "$var" in 00:08:36.217 12:02:49 -- accel/accel.sh@20 -- # IFS=: 00:08:36.217 12:02:49 -- accel/accel.sh@20 -- # read -r var val 00:08:36.217 12:02:49 -- accel/accel.sh@21 -- # val= 00:08:36.217 12:02:49 -- accel/accel.sh@22 -- # case "$var" in 00:08:36.217 12:02:49 -- accel/accel.sh@20 -- # IFS=: 00:08:36.217 12:02:49 -- accel/accel.sh@20 -- # read -r var val 00:08:36.217 12:02:49 -- accel/accel.sh@21 -- # val= 00:08:36.217 12:02:49 -- accel/accel.sh@22 -- # case "$var" in 00:08:36.217 12:02:49 -- accel/accel.sh@20 -- # IFS=: 00:08:36.217 12:02:49 -- accel/accel.sh@20 -- # read -r var val 00:08:36.217 12:02:49 -- accel/accel.sh@28 -- # [[ -n software ]] 00:08:36.217 12:02:49 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:08:36.217 12:02:49 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:36.217 00:08:36.217 real 0m2.767s 00:08:36.217 user 0m2.430s 00:08:36.217 sys 0m0.340s 00:08:36.217 12:02:49 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:36.217 12:02:49 -- common/autotest_common.sh@10 -- # set +x 00:08:36.217 ************************************ 00:08:36.217 END TEST accel_copy_crc32c_C2 00:08:36.217 ************************************ 00:08:36.217 12:02:49 -- accel/accel.sh@99 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:08:36.217 12:02:49 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:08:36.217 12:02:49 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:36.217 12:02:49 -- common/autotest_common.sh@10 -- # set +x 00:08:36.217 ************************************ 00:08:36.217 START TEST accel_dualcast 00:08:36.217 ************************************ 00:08:36.217 12:02:49 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dualcast -y 00:08:36.217 12:02:49 -- accel/accel.sh@16 -- # local accel_opc 00:08:36.217 12:02:49 -- accel/accel.sh@17 -- # local accel_module 00:08:36.217 12:02:49 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dualcast -y 00:08:36.476 12:02:49 -- accel/accel.sh@12 -- # build_accel_config 00:08:36.476 12:02:49 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:08:36.476 12:02:49 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:36.476 12:02:49 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:36.476 12:02:49 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:36.476 12:02:49 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:36.476 12:02:49 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:36.476 12:02:49 -- accel/accel.sh@41 -- # local IFS=, 00:08:36.476 12:02:49 -- accel/accel.sh@42 -- # jq -r . 00:08:36.476 [2024-06-11 12:02:49.266535] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 
00:08:36.476 [2024-06-11 12:02:49.266629] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2684117 ] 00:08:36.476 EAL: No free 2048 kB hugepages reported on node 1 00:08:36.476 [2024-06-11 12:02:49.385028] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:36.476 [2024-06-11 12:02:49.428978] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:37.853 12:02:50 -- accel/accel.sh@18 -- # out=' 00:08:37.853 SPDK Configuration: 00:08:37.853 Core mask: 0x1 00:08:37.853 00:08:37.853 Accel Perf Configuration: 00:08:37.853 Workload Type: dualcast 00:08:37.853 Transfer size: 4096 bytes 00:08:37.853 Vector count 1 00:08:37.853 Module: software 00:08:37.853 Queue depth: 32 00:08:37.853 Allocate depth: 32 00:08:37.853 # threads/core: 1 00:08:37.853 Run time: 1 seconds 00:08:37.853 Verify: Yes 00:08:37.853 00:08:37.853 Running for 1 seconds... 00:08:37.853 00:08:37.853 Core,Thread Transfers Bandwidth Failed Miscompares 00:08:37.853 ------------------------------------------------------------------------------------ 00:08:37.853 0,0 415360/s 1622 MiB/s 0 0 00:08:37.853 ==================================================================================== 00:08:37.853 Total 415360/s 1622 MiB/s 0 0' 00:08:37.853 12:02:50 -- accel/accel.sh@20 -- # IFS=: 00:08:37.853 12:02:50 -- accel/accel.sh@20 -- # read -r var val 00:08:37.853 12:02:50 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:08:37.853 12:02:50 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:08:37.853 12:02:50 -- accel/accel.sh@12 -- # build_accel_config 00:08:37.853 12:02:50 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:37.853 12:02:50 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:37.853 12:02:50 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:37.853 12:02:50 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:37.853 12:02:50 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:37.853 12:02:50 -- accel/accel.sh@41 -- # local IFS=, 00:08:37.853 12:02:50 -- accel/accel.sh@42 -- # jq -r . 00:08:37.853 [2024-06-11 12:02:50.630771] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 
00:08:37.853 [2024-06-11 12:02:50.630865] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2684295 ] 00:08:37.853 EAL: No free 2048 kB hugepages reported on node 1 00:08:37.853 [2024-06-11 12:02:50.748162] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:37.853 [2024-06-11 12:02:50.792860] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:37.853 12:02:50 -- accel/accel.sh@21 -- # val= 00:08:37.853 12:02:50 -- accel/accel.sh@22 -- # case "$var" in 00:08:37.853 12:02:50 -- accel/accel.sh@20 -- # IFS=: 00:08:37.853 12:02:50 -- accel/accel.sh@20 -- # read -r var val 00:08:37.853 12:02:50 -- accel/accel.sh@21 -- # val= 00:08:37.853 12:02:50 -- accel/accel.sh@22 -- # case "$var" in 00:08:37.853 12:02:50 -- accel/accel.sh@20 -- # IFS=: 00:08:37.853 12:02:50 -- accel/accel.sh@20 -- # read -r var val 00:08:37.853 12:02:50 -- accel/accel.sh@21 -- # val=0x1 00:08:37.853 12:02:50 -- accel/accel.sh@22 -- # case "$var" in 00:08:37.853 12:02:50 -- accel/accel.sh@20 -- # IFS=: 00:08:37.853 12:02:50 -- accel/accel.sh@20 -- # read -r var val 00:08:37.853 12:02:50 -- accel/accel.sh@21 -- # val= 00:08:37.853 12:02:50 -- accel/accel.sh@22 -- # case "$var" in 00:08:37.853 12:02:50 -- accel/accel.sh@20 -- # IFS=: 00:08:37.853 12:02:50 -- accel/accel.sh@20 -- # read -r var val 00:08:37.853 12:02:50 -- accel/accel.sh@21 -- # val= 00:08:37.853 12:02:50 -- accel/accel.sh@22 -- # case "$var" in 00:08:37.853 12:02:50 -- accel/accel.sh@20 -- # IFS=: 00:08:37.853 12:02:50 -- accel/accel.sh@20 -- # read -r var val 00:08:37.853 12:02:50 -- accel/accel.sh@21 -- # val=dualcast 00:08:37.853 12:02:50 -- accel/accel.sh@22 -- # case "$var" in 00:08:37.853 12:02:50 -- accel/accel.sh@24 -- # accel_opc=dualcast 00:08:37.853 12:02:50 -- accel/accel.sh@20 -- # IFS=: 00:08:37.853 12:02:50 -- accel/accel.sh@20 -- # read -r var val 00:08:37.853 12:02:50 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:37.853 12:02:50 -- accel/accel.sh@22 -- # case "$var" in 00:08:37.853 12:02:50 -- accel/accel.sh@20 -- # IFS=: 00:08:37.853 12:02:50 -- accel/accel.sh@20 -- # read -r var val 00:08:37.853 12:02:50 -- accel/accel.sh@21 -- # val= 00:08:37.853 12:02:50 -- accel/accel.sh@22 -- # case "$var" in 00:08:37.853 12:02:50 -- accel/accel.sh@20 -- # IFS=: 00:08:37.853 12:02:50 -- accel/accel.sh@20 -- # read -r var val 00:08:37.853 12:02:50 -- accel/accel.sh@21 -- # val=software 00:08:37.853 12:02:50 -- accel/accel.sh@22 -- # case "$var" in 00:08:37.853 12:02:50 -- accel/accel.sh@23 -- # accel_module=software 00:08:37.853 12:02:50 -- accel/accel.sh@20 -- # IFS=: 00:08:37.853 12:02:50 -- accel/accel.sh@20 -- # read -r var val 00:08:37.853 12:02:50 -- accel/accel.sh@21 -- # val=32 00:08:37.853 12:02:50 -- accel/accel.sh@22 -- # case "$var" in 00:08:37.853 12:02:50 -- accel/accel.sh@20 -- # IFS=: 00:08:37.853 12:02:50 -- accel/accel.sh@20 -- # read -r var val 00:08:37.853 12:02:50 -- accel/accel.sh@21 -- # val=32 00:08:37.853 12:02:50 -- accel/accel.sh@22 -- # case "$var" in 00:08:37.853 12:02:50 -- accel/accel.sh@20 -- # IFS=: 00:08:37.853 12:02:50 -- accel/accel.sh@20 -- # read -r var val 00:08:37.853 12:02:50 -- accel/accel.sh@21 -- # val=1 00:08:37.853 12:02:50 -- accel/accel.sh@22 -- # case "$var" in 00:08:37.853 12:02:50 -- accel/accel.sh@20 -- # IFS=: 00:08:37.853 12:02:50 -- accel/accel.sh@20 -- # read -r var val 00:08:37.853 12:02:50 
-- accel/accel.sh@21 -- # val='1 seconds' 00:08:37.853 12:02:50 -- accel/accel.sh@22 -- # case "$var" in 00:08:37.853 12:02:50 -- accel/accel.sh@20 -- # IFS=: 00:08:37.853 12:02:50 -- accel/accel.sh@20 -- # read -r var val 00:08:37.853 12:02:50 -- accel/accel.sh@21 -- # val=Yes 00:08:37.853 12:02:50 -- accel/accel.sh@22 -- # case "$var" in 00:08:37.853 12:02:50 -- accel/accel.sh@20 -- # IFS=: 00:08:37.853 12:02:50 -- accel/accel.sh@20 -- # read -r var val 00:08:37.853 12:02:50 -- accel/accel.sh@21 -- # val= 00:08:37.853 12:02:50 -- accel/accel.sh@22 -- # case "$var" in 00:08:37.853 12:02:50 -- accel/accel.sh@20 -- # IFS=: 00:08:37.853 12:02:50 -- accel/accel.sh@20 -- # read -r var val 00:08:37.853 12:02:50 -- accel/accel.sh@21 -- # val= 00:08:37.853 12:02:50 -- accel/accel.sh@22 -- # case "$var" in 00:08:37.853 12:02:50 -- accel/accel.sh@20 -- # IFS=: 00:08:37.853 12:02:50 -- accel/accel.sh@20 -- # read -r var val 00:08:39.229 12:02:51 -- accel/accel.sh@21 -- # val= 00:08:39.229 12:02:51 -- accel/accel.sh@22 -- # case "$var" in 00:08:39.229 12:02:51 -- accel/accel.sh@20 -- # IFS=: 00:08:39.229 12:02:51 -- accel/accel.sh@20 -- # read -r var val 00:08:39.229 12:02:51 -- accel/accel.sh@21 -- # val= 00:08:39.229 12:02:51 -- accel/accel.sh@22 -- # case "$var" in 00:08:39.229 12:02:51 -- accel/accel.sh@20 -- # IFS=: 00:08:39.229 12:02:51 -- accel/accel.sh@20 -- # read -r var val 00:08:39.229 12:02:51 -- accel/accel.sh@21 -- # val= 00:08:39.229 12:02:51 -- accel/accel.sh@22 -- # case "$var" in 00:08:39.229 12:02:51 -- accel/accel.sh@20 -- # IFS=: 00:08:39.229 12:02:51 -- accel/accel.sh@20 -- # read -r var val 00:08:39.229 12:02:51 -- accel/accel.sh@21 -- # val= 00:08:39.229 12:02:51 -- accel/accel.sh@22 -- # case "$var" in 00:08:39.229 12:02:51 -- accel/accel.sh@20 -- # IFS=: 00:08:39.229 12:02:51 -- accel/accel.sh@20 -- # read -r var val 00:08:39.229 12:02:51 -- accel/accel.sh@21 -- # val= 00:08:39.229 12:02:51 -- accel/accel.sh@22 -- # case "$var" in 00:08:39.229 12:02:51 -- accel/accel.sh@20 -- # IFS=: 00:08:39.229 12:02:51 -- accel/accel.sh@20 -- # read -r var val 00:08:39.229 12:02:51 -- accel/accel.sh@21 -- # val= 00:08:39.229 12:02:51 -- accel/accel.sh@22 -- # case "$var" in 00:08:39.229 12:02:51 -- accel/accel.sh@20 -- # IFS=: 00:08:39.229 12:02:51 -- accel/accel.sh@20 -- # read -r var val 00:08:39.229 12:02:51 -- accel/accel.sh@28 -- # [[ -n software ]] 00:08:39.229 12:02:51 -- accel/accel.sh@28 -- # [[ -n dualcast ]] 00:08:39.229 12:02:51 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:39.229 00:08:39.229 real 0m2.734s 00:08:39.229 user 0m2.382s 00:08:39.229 sys 0m0.356s 00:08:39.229 12:02:51 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:39.229 12:02:51 -- common/autotest_common.sh@10 -- # set +x 00:08:39.229 ************************************ 00:08:39.229 END TEST accel_dualcast 00:08:39.229 ************************************ 00:08:39.229 12:02:52 -- accel/accel.sh@100 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:08:39.229 12:02:52 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:08:39.229 12:02:52 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:39.229 12:02:52 -- common/autotest_common.sh@10 -- # set +x 00:08:39.229 ************************************ 00:08:39.229 START TEST accel_compare 00:08:39.229 ************************************ 00:08:39.229 12:02:52 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w compare -y 00:08:39.229 12:02:52 -- accel/accel.sh@16 -- # local accel_opc 00:08:39.229 12:02:52 
-- accel/accel.sh@17 -- # local accel_module 00:08:39.229 12:02:52 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compare -y 00:08:39.229 12:02:52 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:08:39.229 12:02:52 -- accel/accel.sh@12 -- # build_accel_config 00:08:39.229 12:02:52 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:39.229 12:02:52 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:39.229 12:02:52 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:39.229 12:02:52 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:39.229 12:02:52 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:39.229 12:02:52 -- accel/accel.sh@41 -- # local IFS=, 00:08:39.229 12:02:52 -- accel/accel.sh@42 -- # jq -r . 00:08:39.229 [2024-06-11 12:02:52.047666] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:08:39.229 [2024-06-11 12:02:52.047747] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2684500 ] 00:08:39.229 EAL: No free 2048 kB hugepages reported on node 1 00:08:39.229 [2024-06-11 12:02:52.166059] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:39.229 [2024-06-11 12:02:52.213651] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:40.606 12:02:53 -- accel/accel.sh@18 -- # out=' 00:08:40.606 SPDK Configuration: 00:08:40.606 Core mask: 0x1 00:08:40.606 00:08:40.606 Accel Perf Configuration: 00:08:40.606 Workload Type: compare 00:08:40.606 Transfer size: 4096 bytes 00:08:40.606 Vector count 1 00:08:40.606 Module: software 00:08:40.606 Queue depth: 32 00:08:40.606 Allocate depth: 32 00:08:40.606 # threads/core: 1 00:08:40.606 Run time: 1 seconds 00:08:40.606 Verify: Yes 00:08:40.606 00:08:40.606 Running for 1 seconds... 00:08:40.606 00:08:40.606 Core,Thread Transfers Bandwidth Failed Miscompares 00:08:40.606 ------------------------------------------------------------------------------------ 00:08:40.606 0,0 491552/s 1920 MiB/s 0 0 00:08:40.606 ==================================================================================== 00:08:40.606 Total 491552/s 1920 MiB/s 0 0' 00:08:40.606 12:02:53 -- accel/accel.sh@20 -- # IFS=: 00:08:40.606 12:02:53 -- accel/accel.sh@20 -- # read -r var val 00:08:40.606 12:02:53 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:08:40.606 12:02:53 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:08:40.606 12:02:53 -- accel/accel.sh@12 -- # build_accel_config 00:08:40.606 12:02:53 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:40.606 12:02:53 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:40.606 12:02:53 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:40.606 12:02:53 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:40.606 12:02:53 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:40.606 12:02:53 -- accel/accel.sh@41 -- # local IFS=, 00:08:40.606 12:02:53 -- accel/accel.sh@42 -- # jq -r . 00:08:40.606 [2024-06-11 12:02:53.419340] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 
00:08:40.606 [2024-06-11 12:02:53.419434] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2684678 ] 00:08:40.606 EAL: No free 2048 kB hugepages reported on node 1 00:08:40.606 [2024-06-11 12:02:53.538851] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:40.606 [2024-06-11 12:02:53.586439] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:40.606 12:02:53 -- accel/accel.sh@21 -- # val= 00:08:40.606 12:02:53 -- accel/accel.sh@22 -- # case "$var" in 00:08:40.606 12:02:53 -- accel/accel.sh@20 -- # IFS=: 00:08:40.606 12:02:53 -- accel/accel.sh@20 -- # read -r var val 00:08:40.606 12:02:53 -- accel/accel.sh@21 -- # val= 00:08:40.606 12:02:53 -- accel/accel.sh@22 -- # case "$var" in 00:08:40.606 12:02:53 -- accel/accel.sh@20 -- # IFS=: 00:08:40.606 12:02:53 -- accel/accel.sh@20 -- # read -r var val 00:08:40.865 12:02:53 -- accel/accel.sh@21 -- # val=0x1 00:08:40.865 12:02:53 -- accel/accel.sh@22 -- # case "$var" in 00:08:40.865 12:02:53 -- accel/accel.sh@20 -- # IFS=: 00:08:40.865 12:02:53 -- accel/accel.sh@20 -- # read -r var val 00:08:40.865 12:02:53 -- accel/accel.sh@21 -- # val= 00:08:40.865 12:02:53 -- accel/accel.sh@22 -- # case "$var" in 00:08:40.865 12:02:53 -- accel/accel.sh@20 -- # IFS=: 00:08:40.865 12:02:53 -- accel/accel.sh@20 -- # read -r var val 00:08:40.865 12:02:53 -- accel/accel.sh@21 -- # val= 00:08:40.865 12:02:53 -- accel/accel.sh@22 -- # case "$var" in 00:08:40.865 12:02:53 -- accel/accel.sh@20 -- # IFS=: 00:08:40.865 12:02:53 -- accel/accel.sh@20 -- # read -r var val 00:08:40.865 12:02:53 -- accel/accel.sh@21 -- # val=compare 00:08:40.865 12:02:53 -- accel/accel.sh@22 -- # case "$var" in 00:08:40.865 12:02:53 -- accel/accel.sh@24 -- # accel_opc=compare 00:08:40.865 12:02:53 -- accel/accel.sh@20 -- # IFS=: 00:08:40.865 12:02:53 -- accel/accel.sh@20 -- # read -r var val 00:08:40.865 12:02:53 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:40.865 12:02:53 -- accel/accel.sh@22 -- # case "$var" in 00:08:40.865 12:02:53 -- accel/accel.sh@20 -- # IFS=: 00:08:40.866 12:02:53 -- accel/accel.sh@20 -- # read -r var val 00:08:40.866 12:02:53 -- accel/accel.sh@21 -- # val= 00:08:40.866 12:02:53 -- accel/accel.sh@22 -- # case "$var" in 00:08:40.866 12:02:53 -- accel/accel.sh@20 -- # IFS=: 00:08:40.866 12:02:53 -- accel/accel.sh@20 -- # read -r var val 00:08:40.866 12:02:53 -- accel/accel.sh@21 -- # val=software 00:08:40.866 12:02:53 -- accel/accel.sh@22 -- # case "$var" in 00:08:40.866 12:02:53 -- accel/accel.sh@23 -- # accel_module=software 00:08:40.866 12:02:53 -- accel/accel.sh@20 -- # IFS=: 00:08:40.866 12:02:53 -- accel/accel.sh@20 -- # read -r var val 00:08:40.866 12:02:53 -- accel/accel.sh@21 -- # val=32 00:08:40.866 12:02:53 -- accel/accel.sh@22 -- # case "$var" in 00:08:40.866 12:02:53 -- accel/accel.sh@20 -- # IFS=: 00:08:40.866 12:02:53 -- accel/accel.sh@20 -- # read -r var val 00:08:40.866 12:02:53 -- accel/accel.sh@21 -- # val=32 00:08:40.866 12:02:53 -- accel/accel.sh@22 -- # case "$var" in 00:08:40.866 12:02:53 -- accel/accel.sh@20 -- # IFS=: 00:08:40.866 12:02:53 -- accel/accel.sh@20 -- # read -r var val 00:08:40.866 12:02:53 -- accel/accel.sh@21 -- # val=1 00:08:40.866 12:02:53 -- accel/accel.sh@22 -- # case "$var" in 00:08:40.866 12:02:53 -- accel/accel.sh@20 -- # IFS=: 00:08:40.866 12:02:53 -- accel/accel.sh@20 -- # read -r var val 00:08:40.866 12:02:53 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:08:40.866 12:02:53 -- accel/accel.sh@22 -- # case "$var" in 00:08:40.866 12:02:53 -- accel/accel.sh@20 -- # IFS=: 00:08:40.866 12:02:53 -- accel/accel.sh@20 -- # read -r var val 00:08:40.866 12:02:53 -- accel/accel.sh@21 -- # val=Yes 00:08:40.866 12:02:53 -- accel/accel.sh@22 -- # case "$var" in 00:08:40.866 12:02:53 -- accel/accel.sh@20 -- # IFS=: 00:08:40.866 12:02:53 -- accel/accel.sh@20 -- # read -r var val 00:08:40.866 12:02:53 -- accel/accel.sh@21 -- # val= 00:08:40.866 12:02:53 -- accel/accel.sh@22 -- # case "$var" in 00:08:40.866 12:02:53 -- accel/accel.sh@20 -- # IFS=: 00:08:40.866 12:02:53 -- accel/accel.sh@20 -- # read -r var val 00:08:40.866 12:02:53 -- accel/accel.sh@21 -- # val= 00:08:40.866 12:02:53 -- accel/accel.sh@22 -- # case "$var" in 00:08:40.866 12:02:53 -- accel/accel.sh@20 -- # IFS=: 00:08:40.866 12:02:53 -- accel/accel.sh@20 -- # read -r var val 00:08:41.802 12:02:54 -- accel/accel.sh@21 -- # val= 00:08:41.802 12:02:54 -- accel/accel.sh@22 -- # case "$var" in 00:08:41.802 12:02:54 -- accel/accel.sh@20 -- # IFS=: 00:08:41.802 12:02:54 -- accel/accel.sh@20 -- # read -r var val 00:08:41.802 12:02:54 -- accel/accel.sh@21 -- # val= 00:08:41.802 12:02:54 -- accel/accel.sh@22 -- # case "$var" in 00:08:41.802 12:02:54 -- accel/accel.sh@20 -- # IFS=: 00:08:41.802 12:02:54 -- accel/accel.sh@20 -- # read -r var val 00:08:41.802 12:02:54 -- accel/accel.sh@21 -- # val= 00:08:41.802 12:02:54 -- accel/accel.sh@22 -- # case "$var" in 00:08:41.802 12:02:54 -- accel/accel.sh@20 -- # IFS=: 00:08:41.802 12:02:54 -- accel/accel.sh@20 -- # read -r var val 00:08:41.802 12:02:54 -- accel/accel.sh@21 -- # val= 00:08:41.802 12:02:54 -- accel/accel.sh@22 -- # case "$var" in 00:08:41.802 12:02:54 -- accel/accel.sh@20 -- # IFS=: 00:08:41.802 12:02:54 -- accel/accel.sh@20 -- # read -r var val 00:08:41.802 12:02:54 -- accel/accel.sh@21 -- # val= 00:08:41.802 12:02:54 -- accel/accel.sh@22 -- # case "$var" in 00:08:41.802 12:02:54 -- accel/accel.sh@20 -- # IFS=: 00:08:41.802 12:02:54 -- accel/accel.sh@20 -- # read -r var val 00:08:41.802 12:02:54 -- accel/accel.sh@21 -- # val= 00:08:41.802 12:02:54 -- accel/accel.sh@22 -- # case "$var" in 00:08:41.802 12:02:54 -- accel/accel.sh@20 -- # IFS=: 00:08:41.802 12:02:54 -- accel/accel.sh@20 -- # read -r var val 00:08:41.802 12:02:54 -- accel/accel.sh@28 -- # [[ -n software ]] 00:08:41.802 12:02:54 -- accel/accel.sh@28 -- # [[ -n compare ]] 00:08:41.802 12:02:54 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:41.802 00:08:41.802 real 0m2.756s 00:08:41.802 user 0m2.394s 00:08:41.802 sys 0m0.364s 00:08:41.802 12:02:54 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:41.802 12:02:54 -- common/autotest_common.sh@10 -- # set +x 00:08:41.802 ************************************ 00:08:41.802 END TEST accel_compare 00:08:41.802 ************************************ 00:08:41.802 12:02:54 -- accel/accel.sh@101 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:08:41.802 12:02:54 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:08:41.802 12:02:54 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:41.802 12:02:54 -- common/autotest_common.sh@10 -- # set +x 00:08:41.802 ************************************ 00:08:41.802 START TEST accel_xor 00:08:41.802 ************************************ 00:08:41.802 12:02:54 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w xor -y 00:08:41.802 12:02:54 -- accel/accel.sh@16 -- # local accel_opc 00:08:41.802 12:02:54 -- accel/accel.sh@17 
-- # local accel_module 00:08:41.803 12:02:54 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y 00:08:42.061 12:02:54 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:08:42.061 12:02:54 -- accel/accel.sh@12 -- # build_accel_config 00:08:42.061 12:02:54 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:42.061 12:02:54 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:42.061 12:02:54 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:42.061 12:02:54 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:42.061 12:02:54 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:42.061 12:02:54 -- accel/accel.sh@41 -- # local IFS=, 00:08:42.061 12:02:54 -- accel/accel.sh@42 -- # jq -r . 00:08:42.061 [2024-06-11 12:02:54.853464] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:08:42.061 [2024-06-11 12:02:54.853554] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2684879 ] 00:08:42.061 EAL: No free 2048 kB hugepages reported on node 1 00:08:42.061 [2024-06-11 12:02:54.972728] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:42.061 [2024-06-11 12:02:55.020222] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:43.438 12:02:56 -- accel/accel.sh@18 -- # out=' 00:08:43.438 SPDK Configuration: 00:08:43.438 Core mask: 0x1 00:08:43.438 00:08:43.438 Accel Perf Configuration: 00:08:43.438 Workload Type: xor 00:08:43.438 Source buffers: 2 00:08:43.438 Transfer size: 4096 bytes 00:08:43.438 Vector count 1 00:08:43.439 Module: software 00:08:43.439 Queue depth: 32 00:08:43.439 Allocate depth: 32 00:08:43.439 # threads/core: 1 00:08:43.439 Run time: 1 seconds 00:08:43.439 Verify: Yes 00:08:43.439 00:08:43.439 Running for 1 seconds... 00:08:43.439 00:08:43.439 Core,Thread Transfers Bandwidth Failed Miscompares 00:08:43.439 ------------------------------------------------------------------------------------ 00:08:43.439 0,0 454240/s 1774 MiB/s 0 0 00:08:43.439 ==================================================================================== 00:08:43.439 Total 454240/s 1774 MiB/s 0 0' 00:08:43.439 12:02:56 -- accel/accel.sh@20 -- # IFS=: 00:08:43.439 12:02:56 -- accel/accel.sh@20 -- # read -r var val 00:08:43.439 12:02:56 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:08:43.439 12:02:56 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:08:43.439 12:02:56 -- accel/accel.sh@12 -- # build_accel_config 00:08:43.439 12:02:56 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:43.439 12:02:56 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:43.439 12:02:56 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:43.439 12:02:56 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:43.439 12:02:56 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:43.439 12:02:56 -- accel/accel.sh@41 -- # local IFS=, 00:08:43.439 12:02:56 -- accel/accel.sh@42 -- # jq -r . 00:08:43.439 [2024-06-11 12:02:56.241161] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 
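The xor workload reads N source buffers and xors them together into the destination; with no extra flag it defaults to the two source buffers reported in the configuration above, and the next accel_xor case passes -x 3 to raise that to three. The operation itself is plain bytewise xor, e.g.:

    printf '0x%02x\n' $(( 0xaa ^ 0x55 ))   # 0xff: one byte of a two-source xor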
00:08:43.439 [2024-06-11 12:02:56.241251] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2685059 ] 00:08:43.439 EAL: No free 2048 kB hugepages reported on node 1 00:08:43.439 [2024-06-11 12:02:56.360032] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:43.439 [2024-06-11 12:02:56.406939] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:43.439 12:02:56 -- accel/accel.sh@21 -- # val= 00:08:43.439 12:02:56 -- accel/accel.sh@22 -- # case "$var" in 00:08:43.439 12:02:56 -- accel/accel.sh@20 -- # IFS=: 00:08:43.439 12:02:56 -- accel/accel.sh@20 -- # read -r var val 00:08:43.439 12:02:56 -- accel/accel.sh@21 -- # val= 00:08:43.439 12:02:56 -- accel/accel.sh@22 -- # case "$var" in 00:08:43.439 12:02:56 -- accel/accel.sh@20 -- # IFS=: 00:08:43.439 12:02:56 -- accel/accel.sh@20 -- # read -r var val 00:08:43.439 12:02:56 -- accel/accel.sh@21 -- # val=0x1 00:08:43.439 12:02:56 -- accel/accel.sh@22 -- # case "$var" in 00:08:43.439 12:02:56 -- accel/accel.sh@20 -- # IFS=: 00:08:43.439 12:02:56 -- accel/accel.sh@20 -- # read -r var val 00:08:43.439 12:02:56 -- accel/accel.sh@21 -- # val= 00:08:43.439 12:02:56 -- accel/accel.sh@22 -- # case "$var" in 00:08:43.439 12:02:56 -- accel/accel.sh@20 -- # IFS=: 00:08:43.439 12:02:56 -- accel/accel.sh@20 -- # read -r var val 00:08:43.439 12:02:56 -- accel/accel.sh@21 -- # val= 00:08:43.439 12:02:56 -- accel/accel.sh@22 -- # case "$var" in 00:08:43.439 12:02:56 -- accel/accel.sh@20 -- # IFS=: 00:08:43.439 12:02:56 -- accel/accel.sh@20 -- # read -r var val 00:08:43.439 12:02:56 -- accel/accel.sh@21 -- # val=xor 00:08:43.439 12:02:56 -- accel/accel.sh@22 -- # case "$var" in 00:08:43.439 12:02:56 -- accel/accel.sh@24 -- # accel_opc=xor 00:08:43.439 12:02:56 -- accel/accel.sh@20 -- # IFS=: 00:08:43.439 12:02:56 -- accel/accel.sh@20 -- # read -r var val 00:08:43.439 12:02:56 -- accel/accel.sh@21 -- # val=2 00:08:43.439 12:02:56 -- accel/accel.sh@22 -- # case "$var" in 00:08:43.439 12:02:56 -- accel/accel.sh@20 -- # IFS=: 00:08:43.439 12:02:56 -- accel/accel.sh@20 -- # read -r var val 00:08:43.439 12:02:56 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:43.439 12:02:56 -- accel/accel.sh@22 -- # case "$var" in 00:08:43.439 12:02:56 -- accel/accel.sh@20 -- # IFS=: 00:08:43.439 12:02:56 -- accel/accel.sh@20 -- # read -r var val 00:08:43.439 12:02:56 -- accel/accel.sh@21 -- # val= 00:08:43.439 12:02:56 -- accel/accel.sh@22 -- # case "$var" in 00:08:43.439 12:02:56 -- accel/accel.sh@20 -- # IFS=: 00:08:43.439 12:02:56 -- accel/accel.sh@20 -- # read -r var val 00:08:43.439 12:02:56 -- accel/accel.sh@21 -- # val=software 00:08:43.439 12:02:56 -- accel/accel.sh@22 -- # case "$var" in 00:08:43.439 12:02:56 -- accel/accel.sh@23 -- # accel_module=software 00:08:43.439 12:02:56 -- accel/accel.sh@20 -- # IFS=: 00:08:43.439 12:02:56 -- accel/accel.sh@20 -- # read -r var val 00:08:43.439 12:02:56 -- accel/accel.sh@21 -- # val=32 00:08:43.439 12:02:56 -- accel/accel.sh@22 -- # case "$var" in 00:08:43.439 12:02:56 -- accel/accel.sh@20 -- # IFS=: 00:08:43.439 12:02:56 -- accel/accel.sh@20 -- # read -r var val 00:08:43.439 12:02:56 -- accel/accel.sh@21 -- # val=32 00:08:43.439 12:02:56 -- accel/accel.sh@22 -- # case "$var" in 00:08:43.439 12:02:56 -- accel/accel.sh@20 -- # IFS=: 00:08:43.439 12:02:56 -- accel/accel.sh@20 -- # read -r var val 00:08:43.439 12:02:56 -- 
accel/accel.sh@21 -- # val=1 00:08:43.439 12:02:56 -- accel/accel.sh@22 -- # case "$var" in 00:08:43.439 12:02:56 -- accel/accel.sh@20 -- # IFS=: 00:08:43.439 12:02:56 -- accel/accel.sh@20 -- # read -r var val 00:08:43.439 12:02:56 -- accel/accel.sh@21 -- # val='1 seconds' 00:08:43.699 12:02:56 -- accel/accel.sh@22 -- # case "$var" in 00:08:43.699 12:02:56 -- accel/accel.sh@20 -- # IFS=: 00:08:43.699 12:02:56 -- accel/accel.sh@20 -- # read -r var val 00:08:43.699 12:02:56 -- accel/accel.sh@21 -- # val=Yes 00:08:43.699 12:02:56 -- accel/accel.sh@22 -- # case "$var" in 00:08:43.699 12:02:56 -- accel/accel.sh@20 -- # IFS=: 00:08:43.699 12:02:56 -- accel/accel.sh@20 -- # read -r var val 00:08:43.699 12:02:56 -- accel/accel.sh@21 -- # val= 00:08:43.699 12:02:56 -- accel/accel.sh@22 -- # case "$var" in 00:08:43.699 12:02:56 -- accel/accel.sh@20 -- # IFS=: 00:08:43.699 12:02:56 -- accel/accel.sh@20 -- # read -r var val 00:08:43.699 12:02:56 -- accel/accel.sh@21 -- # val= 00:08:43.699 12:02:56 -- accel/accel.sh@22 -- # case "$var" in 00:08:43.699 12:02:56 -- accel/accel.sh@20 -- # IFS=: 00:08:43.699 12:02:56 -- accel/accel.sh@20 -- # read -r var val 00:08:44.687 12:02:57 -- accel/accel.sh@21 -- # val= 00:08:44.687 12:02:57 -- accel/accel.sh@22 -- # case "$var" in 00:08:44.687 12:02:57 -- accel/accel.sh@20 -- # IFS=: 00:08:44.687 12:02:57 -- accel/accel.sh@20 -- # read -r var val 00:08:44.687 12:02:57 -- accel/accel.sh@21 -- # val= 00:08:44.687 12:02:57 -- accel/accel.sh@22 -- # case "$var" in 00:08:44.687 12:02:57 -- accel/accel.sh@20 -- # IFS=: 00:08:44.687 12:02:57 -- accel/accel.sh@20 -- # read -r var val 00:08:44.687 12:02:57 -- accel/accel.sh@21 -- # val= 00:08:44.687 12:02:57 -- accel/accel.sh@22 -- # case "$var" in 00:08:44.687 12:02:57 -- accel/accel.sh@20 -- # IFS=: 00:08:44.687 12:02:57 -- accel/accel.sh@20 -- # read -r var val 00:08:44.687 12:02:57 -- accel/accel.sh@21 -- # val= 00:08:44.687 12:02:57 -- accel/accel.sh@22 -- # case "$var" in 00:08:44.687 12:02:57 -- accel/accel.sh@20 -- # IFS=: 00:08:44.687 12:02:57 -- accel/accel.sh@20 -- # read -r var val 00:08:44.687 12:02:57 -- accel/accel.sh@21 -- # val= 00:08:44.687 12:02:57 -- accel/accel.sh@22 -- # case "$var" in 00:08:44.687 12:02:57 -- accel/accel.sh@20 -- # IFS=: 00:08:44.687 12:02:57 -- accel/accel.sh@20 -- # read -r var val 00:08:44.687 12:02:57 -- accel/accel.sh@21 -- # val= 00:08:44.687 12:02:57 -- accel/accel.sh@22 -- # case "$var" in 00:08:44.687 12:02:57 -- accel/accel.sh@20 -- # IFS=: 00:08:44.687 12:02:57 -- accel/accel.sh@20 -- # read -r var val 00:08:44.687 12:02:57 -- accel/accel.sh@28 -- # [[ -n software ]] 00:08:44.687 12:02:57 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:08:44.687 12:02:57 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:44.687 00:08:44.687 real 0m2.779s 00:08:44.687 user 0m2.417s 00:08:44.687 sys 0m0.365s 00:08:44.687 12:02:57 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:44.687 12:02:57 -- common/autotest_common.sh@10 -- # set +x 00:08:44.687 ************************************ 00:08:44.687 END TEST accel_xor 00:08:44.687 ************************************ 00:08:44.687 12:02:57 -- accel/accel.sh@102 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:08:44.687 12:02:57 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:08:44.687 12:02:57 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:44.687 12:02:57 -- common/autotest_common.sh@10 -- # set +x 00:08:44.687 ************************************ 00:08:44.687 START TEST accel_xor 
00:08:44.687 ************************************ 00:08:44.687 12:02:57 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w xor -y -x 3 00:08:44.687 12:02:57 -- accel/accel.sh@16 -- # local accel_opc 00:08:44.687 12:02:57 -- accel/accel.sh@17 -- # local accel_module 00:08:44.687 12:02:57 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y -x 3 00:08:44.687 12:02:57 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:08:44.687 12:02:57 -- accel/accel.sh@12 -- # build_accel_config 00:08:44.687 12:02:57 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:44.687 12:02:57 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:44.687 12:02:57 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:44.687 12:02:57 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:44.687 12:02:57 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:44.687 12:02:57 -- accel/accel.sh@41 -- # local IFS=, 00:08:44.687 12:02:57 -- accel/accel.sh@42 -- # jq -r . 00:08:44.687 [2024-06-11 12:02:57.683046] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:08:44.687 [2024-06-11 12:02:57.683148] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2685258 ] 00:08:44.947 EAL: No free 2048 kB hugepages reported on node 1 00:08:44.947 [2024-06-11 12:02:57.807421] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:44.947 [2024-06-11 12:02:57.860140] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:46.324 12:02:59 -- accel/accel.sh@18 -- # out=' 00:08:46.324 SPDK Configuration: 00:08:46.324 Core mask: 0x1 00:08:46.324 00:08:46.324 Accel Perf Configuration: 00:08:46.324 Workload Type: xor 00:08:46.324 Source buffers: 3 00:08:46.324 Transfer size: 4096 bytes 00:08:46.324 Vector count 1 00:08:46.324 Module: software 00:08:46.324 Queue depth: 32 00:08:46.324 Allocate depth: 32 00:08:46.324 # threads/core: 1 00:08:46.324 Run time: 1 seconds 00:08:46.324 Verify: Yes 00:08:46.324 00:08:46.324 Running for 1 seconds... 00:08:46.324 00:08:46.324 Core,Thread Transfers Bandwidth Failed Miscompares 00:08:46.324 ------------------------------------------------------------------------------------ 00:08:46.324 0,0 426752/s 1667 MiB/s 0 0 00:08:46.324 ==================================================================================== 00:08:46.324 Total 426752/s 1667 MiB/s 0 0' 00:08:46.324 12:02:59 -- accel/accel.sh@20 -- # IFS=: 00:08:46.324 12:02:59 -- accel/accel.sh@20 -- # read -r var val 00:08:46.324 12:02:59 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:08:46.324 12:02:59 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:08:46.324 12:02:59 -- accel/accel.sh@12 -- # build_accel_config 00:08:46.324 12:02:59 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:46.324 12:02:59 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:46.324 12:02:59 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:46.324 12:02:59 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:46.324 12:02:59 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:46.324 12:02:59 -- accel/accel.sh@41 -- # local IFS=, 00:08:46.324 12:02:59 -- accel/accel.sh@42 -- # jq -r . 00:08:46.324 [2024-06-11 12:02:59.083087] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 
00:08:46.324 [2024-06-11 12:02:59.083222] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2685442 ] 00:08:46.324 EAL: No free 2048 kB hugepages reported on node 1 00:08:46.324 [2024-06-11 12:02:59.204728] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:46.324 [2024-06-11 12:02:59.252370] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:46.324 12:02:59 -- accel/accel.sh@21 -- # val= 00:08:46.324 12:02:59 -- accel/accel.sh@22 -- # case "$var" in 00:08:46.324 12:02:59 -- accel/accel.sh@20 -- # IFS=: 00:08:46.324 12:02:59 -- accel/accel.sh@20 -- # read -r var val 00:08:46.324 12:02:59 -- accel/accel.sh@21 -- # val= 00:08:46.324 12:02:59 -- accel/accel.sh@22 -- # case "$var" in 00:08:46.324 12:02:59 -- accel/accel.sh@20 -- # IFS=: 00:08:46.324 12:02:59 -- accel/accel.sh@20 -- # read -r var val 00:08:46.324 12:02:59 -- accel/accel.sh@21 -- # val=0x1 00:08:46.324 12:02:59 -- accel/accel.sh@22 -- # case "$var" in 00:08:46.324 12:02:59 -- accel/accel.sh@20 -- # IFS=: 00:08:46.324 12:02:59 -- accel/accel.sh@20 -- # read -r var val 00:08:46.324 12:02:59 -- accel/accel.sh@21 -- # val= 00:08:46.324 12:02:59 -- accel/accel.sh@22 -- # case "$var" in 00:08:46.324 12:02:59 -- accel/accel.sh@20 -- # IFS=: 00:08:46.324 12:02:59 -- accel/accel.sh@20 -- # read -r var val 00:08:46.324 12:02:59 -- accel/accel.sh@21 -- # val= 00:08:46.324 12:02:59 -- accel/accel.sh@22 -- # case "$var" in 00:08:46.324 12:02:59 -- accel/accel.sh@20 -- # IFS=: 00:08:46.324 12:02:59 -- accel/accel.sh@20 -- # read -r var val 00:08:46.324 12:02:59 -- accel/accel.sh@21 -- # val=xor 00:08:46.324 12:02:59 -- accel/accel.sh@22 -- # case "$var" in 00:08:46.324 12:02:59 -- accel/accel.sh@24 -- # accel_opc=xor 00:08:46.324 12:02:59 -- accel/accel.sh@20 -- # IFS=: 00:08:46.324 12:02:59 -- accel/accel.sh@20 -- # read -r var val 00:08:46.324 12:02:59 -- accel/accel.sh@21 -- # val=3 00:08:46.324 12:02:59 -- accel/accel.sh@22 -- # case "$var" in 00:08:46.324 12:02:59 -- accel/accel.sh@20 -- # IFS=: 00:08:46.324 12:02:59 -- accel/accel.sh@20 -- # read -r var val 00:08:46.324 12:02:59 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:46.324 12:02:59 -- accel/accel.sh@22 -- # case "$var" in 00:08:46.324 12:02:59 -- accel/accel.sh@20 -- # IFS=: 00:08:46.324 12:02:59 -- accel/accel.sh@20 -- # read -r var val 00:08:46.324 12:02:59 -- accel/accel.sh@21 -- # val= 00:08:46.324 12:02:59 -- accel/accel.sh@22 -- # case "$var" in 00:08:46.324 12:02:59 -- accel/accel.sh@20 -- # IFS=: 00:08:46.324 12:02:59 -- accel/accel.sh@20 -- # read -r var val 00:08:46.324 12:02:59 -- accel/accel.sh@21 -- # val=software 00:08:46.324 12:02:59 -- accel/accel.sh@22 -- # case "$var" in 00:08:46.324 12:02:59 -- accel/accel.sh@23 -- # accel_module=software 00:08:46.324 12:02:59 -- accel/accel.sh@20 -- # IFS=: 00:08:46.324 12:02:59 -- accel/accel.sh@20 -- # read -r var val 00:08:46.324 12:02:59 -- accel/accel.sh@21 -- # val=32 00:08:46.324 12:02:59 -- accel/accel.sh@22 -- # case "$var" in 00:08:46.324 12:02:59 -- accel/accel.sh@20 -- # IFS=: 00:08:46.324 12:02:59 -- accel/accel.sh@20 -- # read -r var val 00:08:46.324 12:02:59 -- accel/accel.sh@21 -- # val=32 00:08:46.324 12:02:59 -- accel/accel.sh@22 -- # case "$var" in 00:08:46.324 12:02:59 -- accel/accel.sh@20 -- # IFS=: 00:08:46.324 12:02:59 -- accel/accel.sh@20 -- # read -r var val 00:08:46.324 12:02:59 -- 
accel/accel.sh@21 -- # val=1 00:08:46.324 12:02:59 -- accel/accel.sh@22 -- # case "$var" in 00:08:46.324 12:02:59 -- accel/accel.sh@20 -- # IFS=: 00:08:46.324 12:02:59 -- accel/accel.sh@20 -- # read -r var val 00:08:46.324 12:02:59 -- accel/accel.sh@21 -- # val='1 seconds' 00:08:46.324 12:02:59 -- accel/accel.sh@22 -- # case "$var" in 00:08:46.325 12:02:59 -- accel/accel.sh@20 -- # IFS=: 00:08:46.325 12:02:59 -- accel/accel.sh@20 -- # read -r var val 00:08:46.325 12:02:59 -- accel/accel.sh@21 -- # val=Yes 00:08:46.325 12:02:59 -- accel/accel.sh@22 -- # case "$var" in 00:08:46.325 12:02:59 -- accel/accel.sh@20 -- # IFS=: 00:08:46.325 12:02:59 -- accel/accel.sh@20 -- # read -r var val 00:08:46.325 12:02:59 -- accel/accel.sh@21 -- # val= 00:08:46.325 12:02:59 -- accel/accel.sh@22 -- # case "$var" in 00:08:46.325 12:02:59 -- accel/accel.sh@20 -- # IFS=: 00:08:46.325 12:02:59 -- accel/accel.sh@20 -- # read -r var val 00:08:46.325 12:02:59 -- accel/accel.sh@21 -- # val= 00:08:46.325 12:02:59 -- accel/accel.sh@22 -- # case "$var" in 00:08:46.325 12:02:59 -- accel/accel.sh@20 -- # IFS=: 00:08:46.325 12:02:59 -- accel/accel.sh@20 -- # read -r var val 00:08:47.704 12:03:00 -- accel/accel.sh@21 -- # val= 00:08:47.704 12:03:00 -- accel/accel.sh@22 -- # case "$var" in 00:08:47.704 12:03:00 -- accel/accel.sh@20 -- # IFS=: 00:08:47.704 12:03:00 -- accel/accel.sh@20 -- # read -r var val 00:08:47.704 12:03:00 -- accel/accel.sh@21 -- # val= 00:08:47.704 12:03:00 -- accel/accel.sh@22 -- # case "$var" in 00:08:47.704 12:03:00 -- accel/accel.sh@20 -- # IFS=: 00:08:47.704 12:03:00 -- accel/accel.sh@20 -- # read -r var val 00:08:47.704 12:03:00 -- accel/accel.sh@21 -- # val= 00:08:47.704 12:03:00 -- accel/accel.sh@22 -- # case "$var" in 00:08:47.704 12:03:00 -- accel/accel.sh@20 -- # IFS=: 00:08:47.704 12:03:00 -- accel/accel.sh@20 -- # read -r var val 00:08:47.704 12:03:00 -- accel/accel.sh@21 -- # val= 00:08:47.704 12:03:00 -- accel/accel.sh@22 -- # case "$var" in 00:08:47.704 12:03:00 -- accel/accel.sh@20 -- # IFS=: 00:08:47.704 12:03:00 -- accel/accel.sh@20 -- # read -r var val 00:08:47.704 12:03:00 -- accel/accel.sh@21 -- # val= 00:08:47.704 12:03:00 -- accel/accel.sh@22 -- # case "$var" in 00:08:47.704 12:03:00 -- accel/accel.sh@20 -- # IFS=: 00:08:47.704 12:03:00 -- accel/accel.sh@20 -- # read -r var val 00:08:47.704 12:03:00 -- accel/accel.sh@21 -- # val= 00:08:47.704 12:03:00 -- accel/accel.sh@22 -- # case "$var" in 00:08:47.704 12:03:00 -- accel/accel.sh@20 -- # IFS=: 00:08:47.704 12:03:00 -- accel/accel.sh@20 -- # read -r var val 00:08:47.704 12:03:00 -- accel/accel.sh@28 -- # [[ -n software ]] 00:08:47.704 12:03:00 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:08:47.704 12:03:00 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:47.704 00:08:47.704 real 0m2.795s 00:08:47.704 user 0m2.406s 00:08:47.704 sys 0m0.392s 00:08:47.704 12:03:00 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:47.704 12:03:00 -- common/autotest_common.sh@10 -- # set +x 00:08:47.704 ************************************ 00:08:47.704 END TEST accel_xor 00:08:47.704 ************************************ 00:08:47.704 12:03:00 -- accel/accel.sh@103 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:08:47.704 12:03:00 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:08:47.704 12:03:00 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:47.704 12:03:00 -- common/autotest_common.sh@10 -- # set +x 00:08:47.704 ************************************ 00:08:47.704 START TEST 
accel_dif_verify 00:08:47.704 ************************************ 12:03:00 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dif_verify 00:08:47.704 12:03:00 -- accel/accel.sh@16 -- # local accel_opc 00:08:47.704 12:03:00 -- accel/accel.sh@17 -- # local accel_module 00:08:47.704 12:03:00 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_verify 00:08:47.704 12:03:00 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:08:47.704 12:03:00 -- accel/accel.sh@12 -- # build_accel_config 00:08:47.704 12:03:00 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:47.704 12:03:00 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:47.704 12:03:00 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:47.704 12:03:00 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:47.704 12:03:00 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:47.704 12:03:00 -- accel/accel.sh@41 -- # local IFS=, 00:08:47.704 12:03:00 -- accel/accel.sh@42 -- # jq -r . 00:08:47.704 [2024-06-11 12:03:00.513511] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:08:47.704 [2024-06-11 12:03:00.513563] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2685675 ] 00:08:47.704 EAL: No free 2048 kB hugepages reported on node 1 00:08:47.704 [2024-06-11 12:03:00.615964] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:47.704 [2024-06-11 12:03:00.663647] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:49.081 12:03:01 -- accel/accel.sh@18 -- # out=' 00:08:49.081 SPDK Configuration: 00:08:49.081 Core mask: 0x1 00:08:49.081 00:08:49.081 Accel Perf Configuration: 00:08:49.081 Workload Type: dif_verify 00:08:49.081 Vector size: 4096 bytes 00:08:49.081 Transfer size: 4096 bytes 00:08:49.081 Block size: 512 bytes 00:08:49.081 Metadata size: 8 bytes 00:08:49.081 Vector count 1 00:08:49.081 Module: software 00:08:49.081 Queue depth: 32 00:08:49.081 Allocate depth: 32 00:08:49.081 # threads/core: 1 00:08:49.081 Run time: 1 seconds 00:08:49.081 Verify: No 00:08:49.081 00:08:49.081 Running for 1 seconds... 00:08:49.081 00:08:49.081 Core,Thread Transfers Bandwidth Failed Miscompares 00:08:49.081 ------------------------------------------------------------------------------------ 00:08:49.081 0,0 148576/s 580 MiB/s 0 0 00:08:49.081 ==================================================================================== 00:08:49.081 Total 148576/s 580 MiB/s 0 0' 00:08:49.081 12:03:01 -- accel/accel.sh@20 -- # IFS=: 00:08:49.081 12:03:01 -- accel/accel.sh@20 -- # read -r var val 00:08:49.081 12:03:01 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:08:49.081 12:03:01 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:08:49.081 12:03:01 -- accel/accel.sh@12 -- # build_accel_config 00:08:49.081 12:03:01 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:49.081 12:03:01 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:49.081 12:03:01 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:49.081 12:03:01 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:49.081 12:03:01 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:49.081 12:03:01 -- accel/accel.sh@41 -- # local IFS=, 00:08:49.081 12:03:01 -- accel/accel.sh@42 -- # jq -r . 
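Because every run in this log is single-core (Core mask: 0x1), the 0,0 row and the Total row of each result table report the same counters and agree. The MiB/s column follows directly from the transfer rate and the 4096-byte transfer size; a quick check of the dif_verify figure above:

  # MiB/s = transfers/s * transfer size in bytes / 2^20
  awk 'BEGIN { printf "%d MiB/s\n", 148576 * 4096 / 1048576 }'   # prints 580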
00:08:49.081 [2024-06-11 12:03:01.879571] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:08:49.081 [2024-06-11 12:03:01.879662] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2685935 ] 00:08:49.081 EAL: No free 2048 kB hugepages reported on node 1 00:08:49.081 [2024-06-11 12:03:02.000617] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:49.081 [2024-06-11 12:03:02.047981] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:49.081 12:03:02 -- accel/accel.sh@21 -- # val= 00:08:49.081 12:03:02 -- accel/accel.sh@22 -- # case "$var" in 00:08:49.081 12:03:02 -- accel/accel.sh@20 -- # IFS=: 00:08:49.081 12:03:02 -- accel/accel.sh@20 -- # read -r var val 00:08:49.081 12:03:02 -- accel/accel.sh@21 -- # val= 00:08:49.081 12:03:02 -- accel/accel.sh@22 -- # case "$var" in 00:08:49.081 12:03:02 -- accel/accel.sh@20 -- # IFS=: 00:08:49.082 12:03:02 -- accel/accel.sh@20 -- # read -r var val 00:08:49.082 12:03:02 -- accel/accel.sh@21 -- # val=0x1 00:08:49.082 12:03:02 -- accel/accel.sh@22 -- # case "$var" in 00:08:49.082 12:03:02 -- accel/accel.sh@20 -- # IFS=: 00:08:49.082 12:03:02 -- accel/accel.sh@20 -- # read -r var val 00:08:49.082 12:03:02 -- accel/accel.sh@21 -- # val= 00:08:49.082 12:03:02 -- accel/accel.sh@22 -- # case "$var" in 00:08:49.082 12:03:02 -- accel/accel.sh@20 -- # IFS=: 00:08:49.082 12:03:02 -- accel/accel.sh@20 -- # read -r var val 00:08:49.082 12:03:02 -- accel/accel.sh@21 -- # val= 00:08:49.082 12:03:02 -- accel/accel.sh@22 -- # case "$var" in 00:08:49.082 12:03:02 -- accel/accel.sh@20 -- # IFS=: 00:08:49.082 12:03:02 -- accel/accel.sh@20 -- # read -r var val 00:08:49.082 12:03:02 -- accel/accel.sh@21 -- # val=dif_verify 00:08:49.082 12:03:02 -- accel/accel.sh@22 -- # case "$var" in 00:08:49.082 12:03:02 -- accel/accel.sh@24 -- # accel_opc=dif_verify 00:08:49.082 12:03:02 -- accel/accel.sh@20 -- # IFS=: 00:08:49.082 12:03:02 -- accel/accel.sh@20 -- # read -r var val 00:08:49.082 12:03:02 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:49.082 12:03:02 -- accel/accel.sh@22 -- # case "$var" in 00:08:49.082 12:03:02 -- accel/accel.sh@20 -- # IFS=: 00:08:49.082 12:03:02 -- accel/accel.sh@20 -- # read -r var val 00:08:49.082 12:03:02 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:49.082 12:03:02 -- accel/accel.sh@22 -- # case "$var" in 00:08:49.082 12:03:02 -- accel/accel.sh@20 -- # IFS=: 00:08:49.082 12:03:02 -- accel/accel.sh@20 -- # read -r var val 00:08:49.082 12:03:02 -- accel/accel.sh@21 -- # val='512 bytes' 00:08:49.082 12:03:02 -- accel/accel.sh@22 -- # case "$var" in 00:08:49.082 12:03:02 -- accel/accel.sh@20 -- # IFS=: 00:08:49.082 12:03:02 -- accel/accel.sh@20 -- # read -r var val 00:08:49.082 12:03:02 -- accel/accel.sh@21 -- # val='8 bytes' 00:08:49.082 12:03:02 -- accel/accel.sh@22 -- # case "$var" in 00:08:49.082 12:03:02 -- accel/accel.sh@20 -- # IFS=: 00:08:49.082 12:03:02 -- accel/accel.sh@20 -- # read -r var val 00:08:49.082 12:03:02 -- accel/accel.sh@21 -- # val= 00:08:49.082 12:03:02 -- accel/accel.sh@22 -- # case "$var" in 00:08:49.082 12:03:02 -- accel/accel.sh@20 -- # IFS=: 00:08:49.082 12:03:02 -- accel/accel.sh@20 -- # read -r var val 00:08:49.082 12:03:02 -- accel/accel.sh@21 -- # val=software 00:08:49.082 12:03:02 -- accel/accel.sh@22 -- # case "$var" in 00:08:49.082 12:03:02 -- accel/accel.sh@23 -- # 
accel_module=software 00:08:49.082 12:03:02 -- accel/accel.sh@20 -- # IFS=: 00:08:49.082 12:03:02 -- accel/accel.sh@20 -- # read -r var val 00:08:49.082 12:03:02 -- accel/accel.sh@21 -- # val=32 00:08:49.082 12:03:02 -- accel/accel.sh@22 -- # case "$var" in 00:08:49.082 12:03:02 -- accel/accel.sh@20 -- # IFS=: 00:08:49.082 12:03:02 -- accel/accel.sh@20 -- # read -r var val 00:08:49.082 12:03:02 -- accel/accel.sh@21 -- # val=32 00:08:49.082 12:03:02 -- accel/accel.sh@22 -- # case "$var" in 00:08:49.082 12:03:02 -- accel/accel.sh@20 -- # IFS=: 00:08:49.082 12:03:02 -- accel/accel.sh@20 -- # read -r var val 00:08:49.082 12:03:02 -- accel/accel.sh@21 -- # val=1 00:08:49.082 12:03:02 -- accel/accel.sh@22 -- # case "$var" in 00:08:49.082 12:03:02 -- accel/accel.sh@20 -- # IFS=: 00:08:49.082 12:03:02 -- accel/accel.sh@20 -- # read -r var val 00:08:49.082 12:03:02 -- accel/accel.sh@21 -- # val='1 seconds' 00:08:49.082 12:03:02 -- accel/accel.sh@22 -- # case "$var" in 00:08:49.082 12:03:02 -- accel/accel.sh@20 -- # IFS=: 00:08:49.082 12:03:02 -- accel/accel.sh@20 -- # read -r var val 00:08:49.340 12:03:02 -- accel/accel.sh@21 -- # val=No 00:08:49.340 12:03:02 -- accel/accel.sh@22 -- # case "$var" in 00:08:49.340 12:03:02 -- accel/accel.sh@20 -- # IFS=: 00:08:49.340 12:03:02 -- accel/accel.sh@20 -- # read -r var val 00:08:49.340 12:03:02 -- accel/accel.sh@21 -- # val= 00:08:49.340 12:03:02 -- accel/accel.sh@22 -- # case "$var" in 00:08:49.340 12:03:02 -- accel/accel.sh@20 -- # IFS=: 00:08:49.340 12:03:02 -- accel/accel.sh@20 -- # read -r var val 00:08:49.340 12:03:02 -- accel/accel.sh@21 -- # val= 00:08:49.340 12:03:02 -- accel/accel.sh@22 -- # case "$var" in 00:08:49.340 12:03:02 -- accel/accel.sh@20 -- # IFS=: 00:08:49.340 12:03:02 -- accel/accel.sh@20 -- # read -r var val 00:08:50.276 12:03:03 -- accel/accel.sh@21 -- # val= 00:08:50.276 12:03:03 -- accel/accel.sh@22 -- # case "$var" in 00:08:50.276 12:03:03 -- accel/accel.sh@20 -- # IFS=: 00:08:50.276 12:03:03 -- accel/accel.sh@20 -- # read -r var val 00:08:50.276 12:03:03 -- accel/accel.sh@21 -- # val= 00:08:50.276 12:03:03 -- accel/accel.sh@22 -- # case "$var" in 00:08:50.276 12:03:03 -- accel/accel.sh@20 -- # IFS=: 00:08:50.276 12:03:03 -- accel/accel.sh@20 -- # read -r var val 00:08:50.276 12:03:03 -- accel/accel.sh@21 -- # val= 00:08:50.276 12:03:03 -- accel/accel.sh@22 -- # case "$var" in 00:08:50.276 12:03:03 -- accel/accel.sh@20 -- # IFS=: 00:08:50.276 12:03:03 -- accel/accel.sh@20 -- # read -r var val 00:08:50.276 12:03:03 -- accel/accel.sh@21 -- # val= 00:08:50.276 12:03:03 -- accel/accel.sh@22 -- # case "$var" in 00:08:50.276 12:03:03 -- accel/accel.sh@20 -- # IFS=: 00:08:50.276 12:03:03 -- accel/accel.sh@20 -- # read -r var val 00:08:50.276 12:03:03 -- accel/accel.sh@21 -- # val= 00:08:50.276 12:03:03 -- accel/accel.sh@22 -- # case "$var" in 00:08:50.276 12:03:03 -- accel/accel.sh@20 -- # IFS=: 00:08:50.276 12:03:03 -- accel/accel.sh@20 -- # read -r var val 00:08:50.276 12:03:03 -- accel/accel.sh@21 -- # val= 00:08:50.276 12:03:03 -- accel/accel.sh@22 -- # case "$var" in 00:08:50.276 12:03:03 -- accel/accel.sh@20 -- # IFS=: 00:08:50.276 12:03:03 -- accel/accel.sh@20 -- # read -r var val 00:08:50.276 12:03:03 -- accel/accel.sh@28 -- # [[ -n software ]] 00:08:50.276 12:03:03 -- accel/accel.sh@28 -- # [[ -n dif_verify ]] 00:08:50.276 12:03:03 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:50.276 00:08:50.276 real 0m2.749s 00:08:50.276 user 0m2.402s 00:08:50.276 sys 0m0.351s 00:08:50.276 12:03:03 -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:08:50.276 12:03:03 -- common/autotest_common.sh@10 -- # set +x 00:08:50.276 ************************************ 00:08:50.276 END TEST accel_dif_verify 00:08:50.276 ************************************ 00:08:50.276 12:03:03 -- accel/accel.sh@104 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:08:50.276 12:03:03 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:08:50.276 12:03:03 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:50.276 12:03:03 -- common/autotest_common.sh@10 -- # set +x 00:08:50.276 ************************************ 00:08:50.276 START TEST accel_dif_generate 00:08:50.276 ************************************ 00:08:50.276 12:03:03 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dif_generate 00:08:50.276 12:03:03 -- accel/accel.sh@16 -- # local accel_opc 00:08:50.276 12:03:03 -- accel/accel.sh@17 -- # local accel_module 00:08:50.276 12:03:03 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_generate 00:08:50.276 12:03:03 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:08:50.276 12:03:03 -- accel/accel.sh@12 -- # build_accel_config 00:08:50.276 12:03:03 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:50.276 12:03:03 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:50.276 12:03:03 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:50.276 12:03:03 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:50.276 12:03:03 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:50.276 12:03:03 -- accel/accel.sh@41 -- # local IFS=, 00:08:50.276 12:03:03 -- accel/accel.sh@42 -- # jq -r . 00:08:50.535 [2024-06-11 12:03:03.316121] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:08:50.535 [2024-06-11 12:03:03.316212] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2686170 ] 00:08:50.535 EAL: No free 2048 kB hugepages reported on node 1 00:08:50.535 [2024-06-11 12:03:03.434925] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:50.535 [2024-06-11 12:03:03.482786] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:51.909 12:03:04 -- accel/accel.sh@18 -- # out=' 00:08:51.909 SPDK Configuration: 00:08:51.909 Core mask: 0x1 00:08:51.909 00:08:51.909 Accel Perf Configuration: 00:08:51.909 Workload Type: dif_generate 00:08:51.909 Vector size: 4096 bytes 00:08:51.909 Transfer size: 4096 bytes 00:08:51.909 Block size: 512 bytes 00:08:51.909 Metadata size: 8 bytes 00:08:51.909 Vector count 1 00:08:51.909 Module: software 00:08:51.909 Queue depth: 32 00:08:51.909 Allocate depth: 32 00:08:51.909 # threads/core: 1 00:08:51.909 Run time: 1 seconds 00:08:51.909 Verify: No 00:08:51.909 00:08:51.909 Running for 1 seconds... 
00:08:51.910 00:08:51.910 Core,Thread Transfers Bandwidth Failed Miscompares 00:08:51.910 ------------------------------------------------------------------------------------ 00:08:51.910 0,0 178272/s 696 MiB/s 0 0 00:08:51.910 ==================================================================================== 00:08:51.910 Total 178272/s 696 MiB/s 0 0' 00:08:51.910 12:03:04 -- accel/accel.sh@20 -- # IFS=: 00:08:51.910 12:03:04 -- accel/accel.sh@20 -- # read -r var val 00:08:51.910 12:03:04 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:08:51.910 12:03:04 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:08:51.910 12:03:04 -- accel/accel.sh@12 -- # build_accel_config 00:08:51.910 12:03:04 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:51.910 12:03:04 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:51.910 12:03:04 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:51.910 12:03:04 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:51.910 12:03:04 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:51.910 12:03:04 -- accel/accel.sh@41 -- # local IFS=, 00:08:51.910 12:03:04 -- accel/accel.sh@42 -- # jq -r . 00:08:51.910 [2024-06-11 12:03:04.698869] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:08:51.910 [2024-06-11 12:03:04.698958] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2686501 ] 00:08:51.910 EAL: No free 2048 kB hugepages reported on node 1 00:08:51.910 [2024-06-11 12:03:04.818582] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:51.910 [2024-06-11 12:03:04.865940] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:51.910 12:03:04 -- accel/accel.sh@21 -- # val= 00:08:51.910 12:03:04 -- accel/accel.sh@22 -- # case "$var" in 00:08:51.910 12:03:04 -- accel/accel.sh@20 -- # IFS=: 00:08:51.910 12:03:04 -- accel/accel.sh@20 -- # read -r var val 00:08:51.910 12:03:04 -- accel/accel.sh@21 -- # val= 00:08:51.910 12:03:04 -- accel/accel.sh@22 -- # case "$var" in 00:08:51.910 12:03:04 -- accel/accel.sh@20 -- # IFS=: 00:08:51.910 12:03:04 -- accel/accel.sh@20 -- # read -r var val 00:08:51.910 12:03:04 -- accel/accel.sh@21 -- # val=0x1 00:08:51.910 12:03:04 -- accel/accel.sh@22 -- # case "$var" in 00:08:51.910 12:03:04 -- accel/accel.sh@20 -- # IFS=: 00:08:51.910 12:03:04 -- accel/accel.sh@20 -- # read -r var val 00:08:51.910 12:03:04 -- accel/accel.sh@21 -- # val= 00:08:51.910 12:03:04 -- accel/accel.sh@22 -- # case "$var" in 00:08:51.910 12:03:04 -- accel/accel.sh@20 -- # IFS=: 00:08:51.910 12:03:04 -- accel/accel.sh@20 -- # read -r var val 00:08:51.910 12:03:04 -- accel/accel.sh@21 -- # val= 00:08:51.910 12:03:04 -- accel/accel.sh@22 -- # case "$var" in 00:08:51.910 12:03:04 -- accel/accel.sh@20 -- # IFS=: 00:08:51.910 12:03:04 -- accel/accel.sh@20 -- # read -r var val 00:08:51.910 12:03:04 -- accel/accel.sh@21 -- # val=dif_generate 00:08:51.910 12:03:04 -- accel/accel.sh@22 -- # case "$var" in 00:08:51.910 12:03:04 -- accel/accel.sh@24 -- # accel_opc=dif_generate 00:08:51.910 12:03:04 -- accel/accel.sh@20 -- # IFS=: 00:08:51.910 12:03:04 -- accel/accel.sh@20 -- # read -r var val 00:08:51.910 12:03:04 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:51.910 12:03:04 -- accel/accel.sh@22 -- # case "$var" in 00:08:51.910 12:03:04 -- accel/accel.sh@20 -- # IFS=: 
00:08:51.910 12:03:04 -- accel/accel.sh@20 -- # read -r var val 00:08:51.910 12:03:04 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:51.910 12:03:04 -- accel/accel.sh@22 -- # case "$var" in 00:08:51.910 12:03:04 -- accel/accel.sh@20 -- # IFS=: 00:08:51.910 12:03:04 -- accel/accel.sh@20 -- # read -r var val 00:08:51.910 12:03:04 -- accel/accel.sh@21 -- # val='512 bytes' 00:08:51.910 12:03:04 -- accel/accel.sh@22 -- # case "$var" in 00:08:51.910 12:03:04 -- accel/accel.sh@20 -- # IFS=: 00:08:51.910 12:03:04 -- accel/accel.sh@20 -- # read -r var val 00:08:51.910 12:03:04 -- accel/accel.sh@21 -- # val='8 bytes' 00:08:51.910 12:03:04 -- accel/accel.sh@22 -- # case "$var" in 00:08:51.910 12:03:04 -- accel/accel.sh@20 -- # IFS=: 00:08:51.910 12:03:04 -- accel/accel.sh@20 -- # read -r var val 00:08:51.910 12:03:04 -- accel/accel.sh@21 -- # val= 00:08:51.910 12:03:04 -- accel/accel.sh@22 -- # case "$var" in 00:08:51.910 12:03:04 -- accel/accel.sh@20 -- # IFS=: 00:08:51.910 12:03:04 -- accel/accel.sh@20 -- # read -r var val 00:08:51.910 12:03:04 -- accel/accel.sh@21 -- # val=software 00:08:51.910 12:03:04 -- accel/accel.sh@22 -- # case "$var" in 00:08:51.910 12:03:04 -- accel/accel.sh@23 -- # accel_module=software 00:08:51.910 12:03:04 -- accel/accel.sh@20 -- # IFS=: 00:08:51.910 12:03:04 -- accel/accel.sh@20 -- # read -r var val 00:08:51.910 12:03:04 -- accel/accel.sh@21 -- # val=32 00:08:51.910 12:03:04 -- accel/accel.sh@22 -- # case "$var" in 00:08:51.910 12:03:04 -- accel/accel.sh@20 -- # IFS=: 00:08:51.910 12:03:04 -- accel/accel.sh@20 -- # read -r var val 00:08:51.910 12:03:04 -- accel/accel.sh@21 -- # val=32 00:08:51.910 12:03:04 -- accel/accel.sh@22 -- # case "$var" in 00:08:51.910 12:03:04 -- accel/accel.sh@20 -- # IFS=: 00:08:51.910 12:03:04 -- accel/accel.sh@20 -- # read -r var val 00:08:51.910 12:03:04 -- accel/accel.sh@21 -- # val=1 00:08:51.910 12:03:04 -- accel/accel.sh@22 -- # case "$var" in 00:08:51.910 12:03:04 -- accel/accel.sh@20 -- # IFS=: 00:08:51.910 12:03:04 -- accel/accel.sh@20 -- # read -r var val 00:08:51.910 12:03:04 -- accel/accel.sh@21 -- # val='1 seconds' 00:08:51.910 12:03:04 -- accel/accel.sh@22 -- # case "$var" in 00:08:51.910 12:03:04 -- accel/accel.sh@20 -- # IFS=: 00:08:51.910 12:03:04 -- accel/accel.sh@20 -- # read -r var val 00:08:51.910 12:03:04 -- accel/accel.sh@21 -- # val=No 00:08:51.910 12:03:04 -- accel/accel.sh@22 -- # case "$var" in 00:08:51.910 12:03:04 -- accel/accel.sh@20 -- # IFS=: 00:08:51.910 12:03:04 -- accel/accel.sh@20 -- # read -r var val 00:08:51.910 12:03:04 -- accel/accel.sh@21 -- # val= 00:08:51.910 12:03:04 -- accel/accel.sh@22 -- # case "$var" in 00:08:51.910 12:03:04 -- accel/accel.sh@20 -- # IFS=: 00:08:51.910 12:03:04 -- accel/accel.sh@20 -- # read -r var val 00:08:51.910 12:03:04 -- accel/accel.sh@21 -- # val= 00:08:51.910 12:03:04 -- accel/accel.sh@22 -- # case "$var" in 00:08:51.910 12:03:04 -- accel/accel.sh@20 -- # IFS=: 00:08:51.910 12:03:04 -- accel/accel.sh@20 -- # read -r var val 00:08:53.287 12:03:06 -- accel/accel.sh@21 -- # val= 00:08:53.287 12:03:06 -- accel/accel.sh@22 -- # case "$var" in 00:08:53.287 12:03:06 -- accel/accel.sh@20 -- # IFS=: 00:08:53.287 12:03:06 -- accel/accel.sh@20 -- # read -r var val 00:08:53.287 12:03:06 -- accel/accel.sh@21 -- # val= 00:08:53.287 12:03:06 -- accel/accel.sh@22 -- # case "$var" in 00:08:53.287 12:03:06 -- accel/accel.sh@20 -- # IFS=: 00:08:53.287 12:03:06 -- accel/accel.sh@20 -- # read -r var val 00:08:53.287 12:03:06 -- accel/accel.sh@21 -- # val= 00:08:53.287 12:03:06 -- 
accel/accel.sh@22 -- # case "$var" in 00:08:53.287 12:03:06 -- accel/accel.sh@20 -- # IFS=: 00:08:53.287 12:03:06 -- accel/accel.sh@20 -- # read -r var val 00:08:53.287 12:03:06 -- accel/accel.sh@21 -- # val= 00:08:53.287 12:03:06 -- accel/accel.sh@22 -- # case "$var" in 00:08:53.287 12:03:06 -- accel/accel.sh@20 -- # IFS=: 00:08:53.287 12:03:06 -- accel/accel.sh@20 -- # read -r var val 00:08:53.287 12:03:06 -- accel/accel.sh@21 -- # val= 00:08:53.287 12:03:06 -- accel/accel.sh@22 -- # case "$var" in 00:08:53.287 12:03:06 -- accel/accel.sh@20 -- # IFS=: 00:08:53.287 12:03:06 -- accel/accel.sh@20 -- # read -r var val 00:08:53.287 12:03:06 -- accel/accel.sh@21 -- # val= 00:08:53.287 12:03:06 -- accel/accel.sh@22 -- # case "$var" in 00:08:53.287 12:03:06 -- accel/accel.sh@20 -- # IFS=: 00:08:53.287 12:03:06 -- accel/accel.sh@20 -- # read -r var val 00:08:53.287 12:03:06 -- accel/accel.sh@28 -- # [[ -n software ]] 00:08:53.287 12:03:06 -- accel/accel.sh@28 -- # [[ -n dif_generate ]] 00:08:53.287 12:03:06 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:53.287 00:08:53.287 real 0m2.765s 00:08:53.287 user 0m2.420s 00:08:53.287 sys 0m0.350s 00:08:53.287 12:03:06 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:53.287 12:03:06 -- common/autotest_common.sh@10 -- # set +x 00:08:53.287 ************************************ 00:08:53.287 END TEST accel_dif_generate 00:08:53.287 ************************************ 00:08:53.287 12:03:06 -- accel/accel.sh@105 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:08:53.287 12:03:06 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:08:53.287 12:03:06 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:53.287 12:03:06 -- common/autotest_common.sh@10 -- # set +x 00:08:53.287 ************************************ 00:08:53.287 START TEST accel_dif_generate_copy 00:08:53.287 ************************************ 00:08:53.287 12:03:06 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dif_generate_copy 00:08:53.287 12:03:06 -- accel/accel.sh@16 -- # local accel_opc 00:08:53.287 12:03:06 -- accel/accel.sh@17 -- # local accel_module 00:08:53.287 12:03:06 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_generate_copy 00:08:53.287 12:03:06 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:08:53.287 12:03:06 -- accel/accel.sh@12 -- # build_accel_config 00:08:53.287 12:03:06 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:53.287 12:03:06 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:53.287 12:03:06 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:53.287 12:03:06 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:53.287 12:03:06 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:53.287 12:03:06 -- accel/accel.sh@41 -- # local IFS=, 00:08:53.287 12:03:06 -- accel/accel.sh@42 -- # jq -r . 00:08:53.287 [2024-06-11 12:03:06.129849] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 
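The DIF workloads in this stretch (dif_verify and dif_generate above, dif_generate_copy just starting here) all move 4096-byte buffers, and the first two echo the 512-byte block / 8-byte metadata DIF layout in their configuration dumps. A sketch that drives the three back to back, same assumptions as the earlier sketches:

  # Run the three DIF workloads in sequence on the software module.
  SPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  for wl in dif_verify dif_generate dif_generate_copy; do
    "$SPDK_DIR/build/examples/accel_perf" -c <(printf '{}') -t 1 -w "$wl"
  done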
00:08:53.287 [2024-06-11 12:03:06.129931] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2686978 ] 00:08:53.287 EAL: No free 2048 kB hugepages reported on node 1 00:08:53.287 [2024-06-11 12:03:06.247861] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:53.287 [2024-06-11 12:03:06.292871] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:54.665 12:03:07 -- accel/accel.sh@18 -- # out=' 00:08:54.665 SPDK Configuration: 00:08:54.665 Core mask: 0x1 00:08:54.665 00:08:54.665 Accel Perf Configuration: 00:08:54.665 Workload Type: dif_generate_copy 00:08:54.665 Vector size: 4096 bytes 00:08:54.665 Transfer size: 4096 bytes 00:08:54.665 Vector count 1 00:08:54.665 Module: software 00:08:54.665 Queue depth: 32 00:08:54.665 Allocate depth: 32 00:08:54.665 # threads/core: 1 00:08:54.665 Run time: 1 seconds 00:08:54.665 Verify: No 00:08:54.665 00:08:54.665 Running for 1 seconds... 00:08:54.665 00:08:54.665 Core,Thread Transfers Bandwidth Failed Miscompares 00:08:54.665 ------------------------------------------------------------------------------------ 00:08:54.665 0,0 136320/s 532 MiB/s 0 0 00:08:54.665 ==================================================================================== 00:08:54.665 Total 136320/s 532 MiB/s 0 0' 00:08:54.665 12:03:07 -- accel/accel.sh@20 -- # IFS=: 00:08:54.665 12:03:07 -- accel/accel.sh@20 -- # read -r var val 00:08:54.665 12:03:07 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:08:54.665 12:03:07 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:08:54.665 12:03:07 -- accel/accel.sh@12 -- # build_accel_config 00:08:54.665 12:03:07 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:54.665 12:03:07 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:54.665 12:03:07 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:54.665 12:03:07 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:54.665 12:03:07 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:54.665 12:03:07 -- accel/accel.sh@41 -- # local IFS=, 00:08:54.665 12:03:07 -- accel/accel.sh@42 -- # jq -r . 00:08:54.665 [2024-06-11 12:03:07.503094] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 
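Every EAL start-up in this log prints "EAL: No free 2048 kB hugepages reported on node 1"; since each run still comes up on core 0 and completes, the message appears informational on these rigs rather than fatal. The hugepage pool can be inspected before a run through the standard kernel sysfs path (the path is general Linux knowledge, not taken from this log):

  # Show free 2 MiB hugepages per NUMA node.
  for node in /sys/devices/system/node/node*; do
    echo "$node: $(cat "$node"/hugepages/hugepages-2048kB/free_hugepages) free 2048kB pages"
  done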
00:08:54.665 [2024-06-11 12:03:07.503184] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2687191 ] 00:08:54.665 EAL: No free 2048 kB hugepages reported on node 1 00:08:54.665 [2024-06-11 12:03:07.622540] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:54.665 [2024-06-11 12:03:07.666099] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:54.924 12:03:07 -- accel/accel.sh@21 -- # val= 00:08:54.924 12:03:07 -- accel/accel.sh@22 -- # case "$var" in 00:08:54.924 12:03:07 -- accel/accel.sh@20 -- # IFS=: 00:08:54.924 12:03:07 -- accel/accel.sh@20 -- # read -r var val 00:08:54.924 12:03:07 -- accel/accel.sh@21 -- # val= 00:08:54.924 12:03:07 -- accel/accel.sh@22 -- # case "$var" in 00:08:54.924 12:03:07 -- accel/accel.sh@20 -- # IFS=: 00:08:54.924 12:03:07 -- accel/accel.sh@20 -- # read -r var val 00:08:54.924 12:03:07 -- accel/accel.sh@21 -- # val=0x1 00:08:54.924 12:03:07 -- accel/accel.sh@22 -- # case "$var" in 00:08:54.924 12:03:07 -- accel/accel.sh@20 -- # IFS=: 00:08:54.924 12:03:07 -- accel/accel.sh@20 -- # read -r var val 00:08:54.924 12:03:07 -- accel/accel.sh@21 -- # val= 00:08:54.924 12:03:07 -- accel/accel.sh@22 -- # case "$var" in 00:08:54.924 12:03:07 -- accel/accel.sh@20 -- # IFS=: 00:08:54.924 12:03:07 -- accel/accel.sh@20 -- # read -r var val 00:08:54.924 12:03:07 -- accel/accel.sh@21 -- # val= 00:08:54.924 12:03:07 -- accel/accel.sh@22 -- # case "$var" in 00:08:54.924 12:03:07 -- accel/accel.sh@20 -- # IFS=: 00:08:54.924 12:03:07 -- accel/accel.sh@20 -- # read -r var val 00:08:54.924 12:03:07 -- accel/accel.sh@21 -- # val=dif_generate_copy 00:08:54.924 12:03:07 -- accel/accel.sh@22 -- # case "$var" in 00:08:54.924 12:03:07 -- accel/accel.sh@24 -- # accel_opc=dif_generate_copy 00:08:54.924 12:03:07 -- accel/accel.sh@20 -- # IFS=: 00:08:54.924 12:03:07 -- accel/accel.sh@20 -- # read -r var val 00:08:54.924 12:03:07 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:54.924 12:03:07 -- accel/accel.sh@22 -- # case "$var" in 00:08:54.924 12:03:07 -- accel/accel.sh@20 -- # IFS=: 00:08:54.924 12:03:07 -- accel/accel.sh@20 -- # read -r var val 00:08:54.924 12:03:07 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:54.924 12:03:07 -- accel/accel.sh@22 -- # case "$var" in 00:08:54.924 12:03:07 -- accel/accel.sh@20 -- # IFS=: 00:08:54.924 12:03:07 -- accel/accel.sh@20 -- # read -r var val 00:08:54.924 12:03:07 -- accel/accel.sh@21 -- # val= 00:08:54.924 12:03:07 -- accel/accel.sh@22 -- # case "$var" in 00:08:54.924 12:03:07 -- accel/accel.sh@20 -- # IFS=: 00:08:54.924 12:03:07 -- accel/accel.sh@20 -- # read -r var val 00:08:54.924 12:03:07 -- accel/accel.sh@21 -- # val=software 00:08:54.924 12:03:07 -- accel/accel.sh@22 -- # case "$var" in 00:08:54.924 12:03:07 -- accel/accel.sh@23 -- # accel_module=software 00:08:54.924 12:03:07 -- accel/accel.sh@20 -- # IFS=: 00:08:54.924 12:03:07 -- accel/accel.sh@20 -- # read -r var val 00:08:54.924 12:03:07 -- accel/accel.sh@21 -- # val=32 00:08:54.924 12:03:07 -- accel/accel.sh@22 -- # case "$var" in 00:08:54.924 12:03:07 -- accel/accel.sh@20 -- # IFS=: 00:08:54.924 12:03:07 -- accel/accel.sh@20 -- # read -r var val 00:08:54.924 12:03:07 -- accel/accel.sh@21 -- # val=32 00:08:54.924 12:03:07 -- accel/accel.sh@22 -- # case "$var" in 00:08:54.924 12:03:07 -- accel/accel.sh@20 -- # IFS=: 00:08:54.924 12:03:07 -- accel/accel.sh@20 -- # read -r 
var val 00:08:54.924 12:03:07 -- accel/accel.sh@21 -- # val=1 00:08:54.924 12:03:07 -- accel/accel.sh@22 -- # case "$var" in 00:08:54.924 12:03:07 -- accel/accel.sh@20 -- # IFS=: 00:08:54.924 12:03:07 -- accel/accel.sh@20 -- # read -r var val 00:08:54.924 12:03:07 -- accel/accel.sh@21 -- # val='1 seconds' 00:08:54.924 12:03:07 -- accel/accel.sh@22 -- # case "$var" in 00:08:54.924 12:03:07 -- accel/accel.sh@20 -- # IFS=: 00:08:54.924 12:03:07 -- accel/accel.sh@20 -- # read -r var val 00:08:54.924 12:03:07 -- accel/accel.sh@21 -- # val=No 00:08:54.924 12:03:07 -- accel/accel.sh@22 -- # case "$var" in 00:08:54.924 12:03:07 -- accel/accel.sh@20 -- # IFS=: 00:08:54.924 12:03:07 -- accel/accel.sh@20 -- # read -r var val 00:08:54.924 12:03:07 -- accel/accel.sh@21 -- # val= 00:08:54.924 12:03:07 -- accel/accel.sh@22 -- # case "$var" in 00:08:54.924 12:03:07 -- accel/accel.sh@20 -- # IFS=: 00:08:54.924 12:03:07 -- accel/accel.sh@20 -- # read -r var val 00:08:54.924 12:03:07 -- accel/accel.sh@21 -- # val= 00:08:54.924 12:03:07 -- accel/accel.sh@22 -- # case "$var" in 00:08:54.924 12:03:07 -- accel/accel.sh@20 -- # IFS=: 00:08:54.924 12:03:07 -- accel/accel.sh@20 -- # read -r var val 00:08:55.860 12:03:08 -- accel/accel.sh@21 -- # val= 00:08:55.860 12:03:08 -- accel/accel.sh@22 -- # case "$var" in 00:08:55.860 12:03:08 -- accel/accel.sh@20 -- # IFS=: 00:08:55.860 12:03:08 -- accel/accel.sh@20 -- # read -r var val 00:08:55.860 12:03:08 -- accel/accel.sh@21 -- # val= 00:08:55.860 12:03:08 -- accel/accel.sh@22 -- # case "$var" in 00:08:55.860 12:03:08 -- accel/accel.sh@20 -- # IFS=: 00:08:55.860 12:03:08 -- accel/accel.sh@20 -- # read -r var val 00:08:55.860 12:03:08 -- accel/accel.sh@21 -- # val= 00:08:55.860 12:03:08 -- accel/accel.sh@22 -- # case "$var" in 00:08:55.860 12:03:08 -- accel/accel.sh@20 -- # IFS=: 00:08:55.860 12:03:08 -- accel/accel.sh@20 -- # read -r var val 00:08:55.860 12:03:08 -- accel/accel.sh@21 -- # val= 00:08:55.860 12:03:08 -- accel/accel.sh@22 -- # case "$var" in 00:08:55.860 12:03:08 -- accel/accel.sh@20 -- # IFS=: 00:08:55.860 12:03:08 -- accel/accel.sh@20 -- # read -r var val 00:08:55.860 12:03:08 -- accel/accel.sh@21 -- # val= 00:08:55.860 12:03:08 -- accel/accel.sh@22 -- # case "$var" in 00:08:55.860 12:03:08 -- accel/accel.sh@20 -- # IFS=: 00:08:55.860 12:03:08 -- accel/accel.sh@20 -- # read -r var val 00:08:55.860 12:03:08 -- accel/accel.sh@21 -- # val= 00:08:55.860 12:03:08 -- accel/accel.sh@22 -- # case "$var" in 00:08:55.860 12:03:08 -- accel/accel.sh@20 -- # IFS=: 00:08:55.860 12:03:08 -- accel/accel.sh@20 -- # read -r var val 00:08:55.860 12:03:08 -- accel/accel.sh@28 -- # [[ -n software ]] 00:08:55.860 12:03:08 -- accel/accel.sh@28 -- # [[ -n dif_generate_copy ]] 00:08:55.860 12:03:08 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:55.860 00:08:55.860 real 0m2.748s 00:08:55.860 user 0m2.409s 00:08:55.860 sys 0m0.343s 00:08:55.860 12:03:08 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:55.860 12:03:08 -- common/autotest_common.sh@10 -- # set +x 00:08:55.860 ************************************ 00:08:55.860 END TEST accel_dif_generate_copy 00:08:55.860 ************************************ 00:08:56.120 12:03:08 -- accel/accel.sh@107 -- # [[ y == y ]] 00:08:56.120 12:03:08 -- accel/accel.sh@108 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:08:56.120 12:03:08 -- common/autotest_common.sh@1077 -- # '[' 8 -le 1 ']' 00:08:56.120 12:03:08 -- 
common/autotest_common.sh@1083 -- # xtrace_disable 00:08:56.120 12:03:08 -- common/autotest_common.sh@10 -- # set +x 00:08:56.120 ************************************ 00:08:56.120 START TEST accel_comp 00:08:56.120 ************************************ 00:08:56.120 12:03:08 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:08:56.120 12:03:08 -- accel/accel.sh@16 -- # local accel_opc 00:08:56.120 12:03:08 -- accel/accel.sh@17 -- # local accel_module 00:08:56.120 12:03:08 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:08:56.120 12:03:08 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:08:56.120 12:03:08 -- accel/accel.sh@12 -- # build_accel_config 00:08:56.120 12:03:08 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:56.120 12:03:08 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:56.120 12:03:08 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:56.120 12:03:08 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:56.120 12:03:08 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:56.120 12:03:08 -- accel/accel.sh@41 -- # local IFS=, 00:08:56.120 12:03:08 -- accel/accel.sh@42 -- # jq -r . 00:08:56.120 [2024-06-11 12:03:08.926627] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:08:56.120 [2024-06-11 12:03:08.926715] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2687418 ] 00:08:56.120 EAL: No free 2048 kB hugepages reported on node 1 00:08:56.120 [2024-06-11 12:03:09.045320] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:56.120 [2024-06-11 12:03:09.090828] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:57.496 12:03:10 -- accel/accel.sh@18 -- # out='Preparing input file... 00:08:57.496 00:08:57.496 SPDK Configuration: 00:08:57.496 Core mask: 0x1 00:08:57.496 00:08:57.496 Accel Perf Configuration: 00:08:57.496 Workload Type: compress 00:08:57.496 Transfer size: 4096 bytes 00:08:57.497 Vector count 1 00:08:57.497 Module: software 00:08:57.497 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:08:57.497 Queue depth: 32 00:08:57.497 Allocate depth: 32 00:08:57.497 # threads/core: 1 00:08:57.497 Run time: 1 seconds 00:08:57.497 Verify: No 00:08:57.497 00:08:57.497 Running for 1 seconds... 
00:08:57.497 00:08:57.497 Core,Thread Transfers Bandwidth Failed Miscompares 00:08:57.497 ------------------------------------------------------------------------------------ 00:08:57.497 0,0 44416/s 173 MiB/s 0 0 00:08:57.497 ==================================================================================== 00:08:57.497 Total 44416/s 173 MiB/s 0 0' 00:08:57.497 12:03:10 -- accel/accel.sh@20 -- # IFS=: 00:08:57.497 12:03:10 -- accel/accel.sh@20 -- # read -r var val 00:08:57.497 12:03:10 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:08:57.497 12:03:10 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:08:57.497 12:03:10 -- accel/accel.sh@12 -- # build_accel_config 00:08:57.497 12:03:10 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:57.497 12:03:10 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:57.497 12:03:10 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:57.497 12:03:10 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:57.497 12:03:10 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:57.497 12:03:10 -- accel/accel.sh@41 -- # local IFS=, 00:08:57.497 12:03:10 -- accel/accel.sh@42 -- # jq -r . 00:08:57.497 [2024-06-11 12:03:10.296508] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:08:57.497 [2024-06-11 12:03:10.296588] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2687619 ] 00:08:57.497 EAL: No free 2048 kB hugepages reported on node 1 00:08:57.497 [2024-06-11 12:03:10.413476] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:57.497 [2024-06-11 12:03:10.458823] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:57.497 12:03:10 -- accel/accel.sh@21 -- # val= 00:08:57.497 12:03:10 -- accel/accel.sh@22 -- # case "$var" in 00:08:57.497 12:03:10 -- accel/accel.sh@20 -- # IFS=: 00:08:57.497 12:03:10 -- accel/accel.sh@20 -- # read -r var val 00:08:57.497 12:03:10 -- accel/accel.sh@21 -- # val= 00:08:57.497 12:03:10 -- accel/accel.sh@22 -- # case "$var" in 00:08:57.497 12:03:10 -- accel/accel.sh@20 -- # IFS=: 00:08:57.497 12:03:10 -- accel/accel.sh@20 -- # read -r var val 00:08:57.497 12:03:10 -- accel/accel.sh@21 -- # val= 00:08:57.497 12:03:10 -- accel/accel.sh@22 -- # case "$var" in 00:08:57.497 12:03:10 -- accel/accel.sh@20 -- # IFS=: 00:08:57.497 12:03:10 -- accel/accel.sh@20 -- # read -r var val 00:08:57.497 12:03:10 -- accel/accel.sh@21 -- # val=0x1 00:08:57.497 12:03:10 -- accel/accel.sh@22 -- # case "$var" in 00:08:57.497 12:03:10 -- accel/accel.sh@20 -- # IFS=: 00:08:57.497 12:03:10 -- accel/accel.sh@20 -- # read -r var val 00:08:57.497 12:03:10 -- accel/accel.sh@21 -- # val= 00:08:57.497 12:03:10 -- accel/accel.sh@22 -- # case "$var" in 00:08:57.497 12:03:10 -- accel/accel.sh@20 -- # IFS=: 00:08:57.497 12:03:10 -- accel/accel.sh@20 -- # read -r var val 00:08:57.497 12:03:10 -- accel/accel.sh@21 -- # val= 00:08:57.497 12:03:10 -- accel/accel.sh@22 -- # case "$var" in 00:08:57.497 12:03:10 -- accel/accel.sh@20 -- # IFS=: 00:08:57.497 12:03:10 -- accel/accel.sh@20 -- # read -r var val 00:08:57.497 12:03:10 -- accel/accel.sh@21 -- # val=compress 00:08:57.497 12:03:10 -- accel/accel.sh@22 -- # case "$var" in 
12:03:10 -- accel/accel.sh@24 -- # accel_opc=compress 00:08:57.497 12:03:10 -- accel/accel.sh@20 -- # IFS=: 00:08:57.497 12:03:10 -- accel/accel.sh@20 -- # read -r var val 00:08:57.497 12:03:10 -- accel/accel.sh@21 -- # val='4096 bytes' 00:08:57.497 12:03:10 -- accel/accel.sh@22 -- # case "$var" in 00:08:57.497 12:03:10 -- accel/accel.sh@20 -- # IFS=: 00:08:57.497 12:03:10 -- accel/accel.sh@20 -- # read -r var val 00:08:57.497 12:03:10 -- accel/accel.sh@21 -- # val= 00:08:57.497 12:03:10 -- accel/accel.sh@22 -- # case "$var" in 00:08:57.497 12:03:10 -- accel/accel.sh@20 -- # IFS=: 00:08:57.497 12:03:10 -- accel/accel.sh@20 -- # read -r var val 00:08:57.497 12:03:10 -- accel/accel.sh@21 -- # val=software 00:08:57.497 12:03:10 -- accel/accel.sh@22 -- # case "$var" in 00:08:57.497 12:03:10 -- accel/accel.sh@23 -- # accel_module=software 00:08:57.497 12:03:10 -- accel/accel.sh@20 -- # IFS=: 00:08:57.497 12:03:10 -- accel/accel.sh@20 -- # read -r var val 00:08:57.497 12:03:10 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:08:57.497 12:03:10 -- accel/accel.sh@22 -- # case "$var" in 00:08:57.497 12:03:10 -- accel/accel.sh@20 -- # IFS=: 00:08:57.497 12:03:10 -- accel/accel.sh@20 -- # read -r var val 00:08:57.497 12:03:10 -- accel/accel.sh@21 -- # val=32 00:08:57.497 12:03:10 -- accel/accel.sh@22 -- # case "$var" in 00:08:57.497 12:03:10 -- accel/accel.sh@20 -- # IFS=: 00:08:57.497 12:03:10 -- accel/accel.sh@20 -- # read -r var val 00:08:57.497 12:03:10 -- accel/accel.sh@21 -- # val=32 00:08:57.497 12:03:10 -- accel/accel.sh@22 -- # case "$var" in 00:08:57.497 12:03:10 -- accel/accel.sh@20 -- # IFS=: 00:08:57.497 12:03:10 -- accel/accel.sh@20 -- # read -r var val 00:08:57.497 12:03:10 -- accel/accel.sh@21 -- # val=1 00:08:57.497 12:03:10 -- accel/accel.sh@22 -- # case "$var" in 00:08:57.497 12:03:10 -- accel/accel.sh@20 -- # IFS=: 00:08:57.497 12:03:10 -- accel/accel.sh@20 -- # read -r var val 00:08:57.497 12:03:10 -- accel/accel.sh@21 -- # val='1 seconds' 00:08:57.497 12:03:10 -- accel/accel.sh@22 -- # case "$var" in 00:08:57.497 12:03:10 -- accel/accel.sh@20 -- # IFS=: 00:08:57.497 12:03:10 -- accel/accel.sh@20 -- # read -r var val 00:08:57.497 12:03:10 -- accel/accel.sh@21 -- # val=No 00:08:57.497 12:03:10 -- accel/accel.sh@22 -- # case "$var" in 00:08:57.497 12:03:10 -- accel/accel.sh@20 -- # IFS=: 00:08:57.497 12:03:10 -- accel/accel.sh@20 -- # read -r var val 00:08:57.497 12:03:10 -- accel/accel.sh@21 -- # val= 00:08:57.497 12:03:10 -- accel/accel.sh@22 -- # case "$var" in 00:08:57.497 12:03:10 -- accel/accel.sh@20 -- # IFS=: 00:08:57.497 12:03:10 -- accel/accel.sh@20 -- # read -r var val 00:08:57.497 12:03:10 -- accel/accel.sh@21 -- # val= 00:08:57.756 12:03:10 -- accel/accel.sh@22 -- # case "$var" in 00:08:57.756 12:03:10 -- accel/accel.sh@20 -- # IFS=: 00:08:57.756 12:03:10 -- accel/accel.sh@20 -- # read -r var val 00:08:58.691 12:03:11 -- accel/accel.sh@21 -- # val= 00:08:58.691 12:03:11 -- accel/accel.sh@22 -- # case "$var" in 00:08:58.691 12:03:11 -- accel/accel.sh@20 -- # IFS=: 00:08:58.691 12:03:11 -- accel/accel.sh@20 -- # read -r var val 00:08:58.691 12:03:11 -- accel/accel.sh@21 -- # val= 00:08:58.691 12:03:11 -- accel/accel.sh@22 -- # case "$var" in 00:08:58.691 12:03:11 -- accel/accel.sh@20 -- # IFS=: 00:08:58.691 12:03:11 -- accel/accel.sh@20 -- # read -r var val 00:08:58.691 12:03:11 -- accel/accel.sh@21 -- # val= 00:08:58.691 12:03:11 -- accel/accel.sh@22 -- # case "$var" in 00:08:58.691 12:03:11 -- accel/accel.sh@20 -- # 
IFS=: 00:08:58.691 12:03:11 -- accel/accel.sh@20 -- # read -r var val 00:08:58.691 12:03:11 -- accel/accel.sh@21 -- # val= 00:08:58.691 12:03:11 -- accel/accel.sh@22 -- # case "$var" in 00:08:58.691 12:03:11 -- accel/accel.sh@20 -- # IFS=: 00:08:58.691 12:03:11 -- accel/accel.sh@20 -- # read -r var val 00:08:58.691 12:03:11 -- accel/accel.sh@21 -- # val= 00:08:58.691 12:03:11 -- accel/accel.sh@22 -- # case "$var" in 00:08:58.691 12:03:11 -- accel/accel.sh@20 -- # IFS=: 00:08:58.691 12:03:11 -- accel/accel.sh@20 -- # read -r var val 00:08:58.691 12:03:11 -- accel/accel.sh@21 -- # val= 00:08:58.691 12:03:11 -- accel/accel.sh@22 -- # case "$var" in 00:08:58.691 12:03:11 -- accel/accel.sh@20 -- # IFS=: 00:08:58.691 12:03:11 -- accel/accel.sh@20 -- # read -r var val 00:08:58.691 12:03:11 -- accel/accel.sh@28 -- # [[ -n software ]] 00:08:58.691 12:03:11 -- accel/accel.sh@28 -- # [[ -n compress ]] 00:08:58.691 12:03:11 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:58.691 00:08:58.691 real 0m2.744s 00:08:58.691 user 0m2.392s 00:08:58.691 sys 0m0.358s 00:08:58.691 12:03:11 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:58.691 12:03:11 -- common/autotest_common.sh@10 -- # set +x 00:08:58.691 ************************************ 00:08:58.691 END TEST accel_comp 00:08:58.691 ************************************ 00:08:58.691 12:03:11 -- accel/accel.sh@109 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:08:58.691 12:03:11 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:08:58.691 12:03:11 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:58.691 12:03:11 -- common/autotest_common.sh@10 -- # set +x 00:08:58.691 ************************************ 00:08:58.691 START TEST accel_decomp 00:08:58.691 ************************************ 00:08:58.691 12:03:11 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:08:58.691 12:03:11 -- accel/accel.sh@16 -- # local accel_opc 00:08:58.691 12:03:11 -- accel/accel.sh@17 -- # local accel_module 00:08:58.691 12:03:11 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:08:58.691 12:03:11 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:08:58.691 12:03:11 -- accel/accel.sh@12 -- # build_accel_config 00:08:58.691 12:03:11 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:08:58.691 12:03:11 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:58.692 12:03:11 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:58.692 12:03:11 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:08:58.692 12:03:11 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:08:58.692 12:03:11 -- accel/accel.sh@41 -- # local IFS=, 00:08:58.692 12:03:11 -- accel/accel.sh@42 -- # jq -r . 00:08:58.692 [2024-06-11 12:03:11.714167] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 
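
The accel_comp test that just completed follows the pattern every test in this block uses: run_test wraps accel_test, which launches the accel_perf example binary, parses the tool's printed configuration (the 'read -r var val' / 'case "$var"' loop visible in the trace), and finally asserts that the reported module and opcode match expectations ('[[ -n software ]]', '[[ -n compress ]]' above). The '-- accel/accel.sh@NN -- #' prefixes are bash xtrace output; the harness evidently runs with set -x and a PS4 that embeds the script name and line number. A minimal sketch of the same compress run outside the harness, assuming a built SPDK tree at a placeholder $SPDK_ROOT (not a path taken from this log):

  # Software compress workload, 1-second run, same flags as traced above.
  SPDK_ROOT=/path/to/spdk
  "$SPDK_ROOT/build/examples/accel_perf" -t 1 -w compress -l "$SPDK_ROOT/test/accel/bib"
  # The harness additionally passes '-c /dev/fd/62' to feed a generated JSON
  # accel config; with no hardware modules configured, the software defaults
  # should suffice, so it is omitted here.
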
00:08:58.692 [2024-06-11 12:03:11.714260] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2687841 ] 00:08:58.951 EAL: No free 2048 kB hugepages reported on node 1 00:08:58.951 [2024-06-11 12:03:11.833314] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:58.951 [2024-06-11 12:03:11.880914] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:00.325 12:03:13 -- accel/accel.sh@18 -- # out='Preparing input file... 00:09:00.325 00:09:00.325 SPDK Configuration: 00:09:00.325 Core mask: 0x1 00:09:00.325 00:09:00.325 Accel Perf Configuration: 00:09:00.325 Workload Type: decompress 00:09:00.325 Transfer size: 4096 bytes 00:09:00.325 Vector count 1 00:09:00.325 Module: software 00:09:00.325 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:09:00.325 Queue depth: 32 00:09:00.325 Allocate depth: 32 00:09:00.325 # threads/core: 1 00:09:00.325 Run time: 1 seconds 00:09:00.325 Verify: Yes 00:09:00.325 00:09:00.325 Running for 1 seconds... 00:09:00.325 00:09:00.325 Core,Thread Transfers Bandwidth Failed Miscompares 00:09:00.325 ------------------------------------------------------------------------------------ 00:09:00.325 0,0 60800/s 112 MiB/s 0 0 00:09:00.325 ==================================================================================== 00:09:00.325 Total 60800/s 237 MiB/s 0 0' 00:09:00.325 12:03:13 -- accel/accel.sh@20 -- # IFS=: 00:09:00.325 12:03:13 -- accel/accel.sh@20 -- # read -r var val 00:09:00.325 12:03:13 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:09:00.325 12:03:13 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:09:00.325 12:03:13 -- accel/accel.sh@12 -- # build_accel_config 00:09:00.325 12:03:13 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:09:00.325 12:03:13 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:00.325 12:03:13 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:00.325 12:03:13 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:09:00.326 12:03:13 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:09:00.326 12:03:13 -- accel/accel.sh@41 -- # local IFS=, 00:09:00.326 12:03:13 -- accel/accel.sh@42 -- # jq -r . 00:09:00.326 [2024-06-11 12:03:13.091256] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 
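
A quick consistency check on the decompress table above: bandwidth should be transfers per second times the 4096-byte transfer size, and the Total row matches that product (the per-core bandwidth column does not always agree with it in this accel_perf build — 185 vs 173 MiB/s in the compress run, 112 vs 237 MiB/s here — so the Total row is the safer figure to quote):

  # 60800 transfers/s x 4096 B per transfer, expressed in MiB/s:
  echo $(( 60800 * 4096 / 1024 / 1024 ))   # prints 237, matching 'Total 60800/s 237 MiB/s'
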
00:09:00.326 [2024-06-11 12:03:13.091341] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2688021 ] 00:09:00.326 EAL: No free 2048 kB hugepages reported on node 1 00:09:00.326 [2024-06-11 12:03:13.211081] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:00.326 [2024-06-11 12:03:13.258555] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:00.326 12:03:13 -- accel/accel.sh@21 -- # val= 00:09:00.326 12:03:13 -- accel/accel.sh@22 -- # case "$var" in 00:09:00.326 12:03:13 -- accel/accel.sh@20 -- # IFS=: 00:09:00.326 12:03:13 -- accel/accel.sh@20 -- # read -r var val 00:09:00.326 12:03:13 -- accel/accel.sh@21 -- # val= 00:09:00.326 12:03:13 -- accel/accel.sh@22 -- # case "$var" in 00:09:00.326 12:03:13 -- accel/accel.sh@20 -- # IFS=: 00:09:00.326 12:03:13 -- accel/accel.sh@20 -- # read -r var val 00:09:00.326 12:03:13 -- accel/accel.sh@21 -- # val= 00:09:00.326 12:03:13 -- accel/accel.sh@22 -- # case "$var" in 00:09:00.326 12:03:13 -- accel/accel.sh@20 -- # IFS=: 00:09:00.326 12:03:13 -- accel/accel.sh@20 -- # read -r var val 00:09:00.326 12:03:13 -- accel/accel.sh@21 -- # val=0x1 00:09:00.326 12:03:13 -- accel/accel.sh@22 -- # case "$var" in 00:09:00.326 12:03:13 -- accel/accel.sh@20 -- # IFS=: 00:09:00.326 12:03:13 -- accel/accel.sh@20 -- # read -r var val 00:09:00.326 12:03:13 -- accel/accel.sh@21 -- # val= 00:09:00.326 12:03:13 -- accel/accel.sh@22 -- # case "$var" in 00:09:00.326 12:03:13 -- accel/accel.sh@20 -- # IFS=: 00:09:00.326 12:03:13 -- accel/accel.sh@20 -- # read -r var val 00:09:00.326 12:03:13 -- accel/accel.sh@21 -- # val= 00:09:00.326 12:03:13 -- accel/accel.sh@22 -- # case "$var" in 00:09:00.326 12:03:13 -- accel/accel.sh@20 -- # IFS=: 00:09:00.326 12:03:13 -- accel/accel.sh@20 -- # read -r var val 00:09:00.326 12:03:13 -- accel/accel.sh@21 -- # val=decompress 00:09:00.326 12:03:13 -- accel/accel.sh@22 -- # case "$var" in 00:09:00.326 12:03:13 -- accel/accel.sh@24 -- # accel_opc=decompress 00:09:00.326 12:03:13 -- accel/accel.sh@20 -- # IFS=: 00:09:00.326 12:03:13 -- accel/accel.sh@20 -- # read -r var val 00:09:00.326 12:03:13 -- accel/accel.sh@21 -- # val='4096 bytes' 00:09:00.326 12:03:13 -- accel/accel.sh@22 -- # case "$var" in 00:09:00.326 12:03:13 -- accel/accel.sh@20 -- # IFS=: 00:09:00.326 12:03:13 -- accel/accel.sh@20 -- # read -r var val 00:09:00.326 12:03:13 -- accel/accel.sh@21 -- # val= 00:09:00.326 12:03:13 -- accel/accel.sh@22 -- # case "$var" in 00:09:00.326 12:03:13 -- accel/accel.sh@20 -- # IFS=: 00:09:00.326 12:03:13 -- accel/accel.sh@20 -- # read -r var val 00:09:00.326 12:03:13 -- accel/accel.sh@21 -- # val=software 00:09:00.326 12:03:13 -- accel/accel.sh@22 -- # case "$var" in 00:09:00.326 12:03:13 -- accel/accel.sh@23 -- # accel_module=software 00:09:00.326 12:03:13 -- accel/accel.sh@20 -- # IFS=: 00:09:00.326 12:03:13 -- accel/accel.sh@20 -- # read -r var val 00:09:00.326 12:03:13 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:09:00.326 12:03:13 -- accel/accel.sh@22 -- # case "$var" in 00:09:00.326 12:03:13 -- accel/accel.sh@20 -- # IFS=: 00:09:00.326 12:03:13 -- accel/accel.sh@20 -- # read -r var val 00:09:00.326 12:03:13 -- accel/accel.sh@21 -- # val=32 00:09:00.326 12:03:13 -- accel/accel.sh@22 -- # case "$var" in 00:09:00.326 12:03:13 -- accel/accel.sh@20 -- # IFS=: 00:09:00.326 
12:03:13 -- accel/accel.sh@20 -- # read -r var val 00:09:00.326 12:03:13 -- accel/accel.sh@21 -- # val=32 00:09:00.326 12:03:13 -- accel/accel.sh@22 -- # case "$var" in 00:09:00.326 12:03:13 -- accel/accel.sh@20 -- # IFS=: 00:09:00.326 12:03:13 -- accel/accel.sh@20 -- # read -r var val 00:09:00.326 12:03:13 -- accel/accel.sh@21 -- # val=1 00:09:00.326 12:03:13 -- accel/accel.sh@22 -- # case "$var" in 00:09:00.326 12:03:13 -- accel/accel.sh@20 -- # IFS=: 00:09:00.326 12:03:13 -- accel/accel.sh@20 -- # read -r var val 00:09:00.326 12:03:13 -- accel/accel.sh@21 -- # val='1 seconds' 00:09:00.326 12:03:13 -- accel/accel.sh@22 -- # case "$var" in 00:09:00.326 12:03:13 -- accel/accel.sh@20 -- # IFS=: 00:09:00.326 12:03:13 -- accel/accel.sh@20 -- # read -r var val 00:09:00.326 12:03:13 -- accel/accel.sh@21 -- # val=Yes 00:09:00.326 12:03:13 -- accel/accel.sh@22 -- # case "$var" in 00:09:00.326 12:03:13 -- accel/accel.sh@20 -- # IFS=: 00:09:00.326 12:03:13 -- accel/accel.sh@20 -- # read -r var val 00:09:00.326 12:03:13 -- accel/accel.sh@21 -- # val= 00:09:00.326 12:03:13 -- accel/accel.sh@22 -- # case "$var" in 00:09:00.326 12:03:13 -- accel/accel.sh@20 -- # IFS=: 00:09:00.326 12:03:13 -- accel/accel.sh@20 -- # read -r var val 00:09:00.326 12:03:13 -- accel/accel.sh@21 -- # val= 00:09:00.326 12:03:13 -- accel/accel.sh@22 -- # case "$var" in 00:09:00.326 12:03:13 -- accel/accel.sh@20 -- # IFS=: 00:09:00.326 12:03:13 -- accel/accel.sh@20 -- # read -r var val 00:09:01.701 12:03:14 -- accel/accel.sh@21 -- # val= 00:09:01.701 12:03:14 -- accel/accel.sh@22 -- # case "$var" in 00:09:01.701 12:03:14 -- accel/accel.sh@20 -- # IFS=: 00:09:01.701 12:03:14 -- accel/accel.sh@20 -- # read -r var val 00:09:01.701 12:03:14 -- accel/accel.sh@21 -- # val= 00:09:01.701 12:03:14 -- accel/accel.sh@22 -- # case "$var" in 00:09:01.701 12:03:14 -- accel/accel.sh@20 -- # IFS=: 00:09:01.701 12:03:14 -- accel/accel.sh@20 -- # read -r var val 00:09:01.701 12:03:14 -- accel/accel.sh@21 -- # val= 00:09:01.701 12:03:14 -- accel/accel.sh@22 -- # case "$var" in 00:09:01.701 12:03:14 -- accel/accel.sh@20 -- # IFS=: 00:09:01.701 12:03:14 -- accel/accel.sh@20 -- # read -r var val 00:09:01.701 12:03:14 -- accel/accel.sh@21 -- # val= 00:09:01.701 12:03:14 -- accel/accel.sh@22 -- # case "$var" in 00:09:01.701 12:03:14 -- accel/accel.sh@20 -- # IFS=: 00:09:01.701 12:03:14 -- accel/accel.sh@20 -- # read -r var val 00:09:01.701 12:03:14 -- accel/accel.sh@21 -- # val= 00:09:01.701 12:03:14 -- accel/accel.sh@22 -- # case "$var" in 00:09:01.701 12:03:14 -- accel/accel.sh@20 -- # IFS=: 00:09:01.701 12:03:14 -- accel/accel.sh@20 -- # read -r var val 00:09:01.701 12:03:14 -- accel/accel.sh@21 -- # val= 00:09:01.701 12:03:14 -- accel/accel.sh@22 -- # case "$var" in 00:09:01.701 12:03:14 -- accel/accel.sh@20 -- # IFS=: 00:09:01.701 12:03:14 -- accel/accel.sh@20 -- # read -r var val 00:09:01.701 12:03:14 -- accel/accel.sh@28 -- # [[ -n software ]] 00:09:01.701 12:03:14 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:09:01.701 12:03:14 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:01.701 00:09:01.701 real 0m2.766s 00:09:01.701 user 0m2.404s 00:09:01.701 sys 0m0.366s 00:09:01.701 12:03:14 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:01.701 12:03:14 -- common/autotest_common.sh@10 -- # set +x 00:09:01.701 ************************************ 00:09:01.701 END TEST accel_decomp 00:09:01.701 ************************************ 00:09:01.701 12:03:14 -- accel/accel.sh@110 -- # run_test accel_decmop_full accel_test -t 1 
-w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:09:01.701 12:03:14 -- common/autotest_common.sh@1077 -- # '[' 11 -le 1 ']' 00:09:01.701 12:03:14 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:09:01.701 12:03:14 -- common/autotest_common.sh@10 -- # set +x 00:09:01.701 ************************************ 00:09:01.701 START TEST accel_decmop_full 00:09:01.701 ************************************ 00:09:01.701 12:03:14 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:09:01.701 12:03:14 -- accel/accel.sh@16 -- # local accel_opc 00:09:01.701 12:03:14 -- accel/accel.sh@17 -- # local accel_module 00:09:01.701 12:03:14 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:09:01.701 12:03:14 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:09:01.701 12:03:14 -- accel/accel.sh@12 -- # build_accel_config 00:09:01.701 12:03:14 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:09:01.701 12:03:14 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:01.701 12:03:14 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:01.701 12:03:14 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:09:01.701 12:03:14 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:09:01.701 12:03:14 -- accel/accel.sh@41 -- # local IFS=, 00:09:01.701 12:03:14 -- accel/accel.sh@42 -- # jq -r . 00:09:01.701 [2024-06-11 12:03:14.528201] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:09:01.701 [2024-06-11 12:03:14.528293] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2688220 ] 00:09:01.701 EAL: No free 2048 kB hugepages reported on node 1 00:09:01.701 [2024-06-11 12:03:14.650415] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:01.701 [2024-06-11 12:03:14.697774] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:03.076 12:03:15 -- accel/accel.sh@18 -- # out='Preparing input file... 00:09:03.076 00:09:03.076 SPDK Configuration: 00:09:03.076 Core mask: 0x1 00:09:03.076 00:09:03.076 Accel Perf Configuration: 00:09:03.076 Workload Type: decompress 00:09:03.076 Transfer size: 111250 bytes 00:09:03.076 Vector count 1 00:09:03.076 Module: software 00:09:03.076 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:09:03.076 Queue depth: 32 00:09:03.076 Allocate depth: 32 00:09:03.076 # threads/core: 1 00:09:03.076 Run time: 1 seconds 00:09:03.076 Verify: Yes 00:09:03.076 00:09:03.076 Running for 1 seconds... 
00:09:03.076 00:09:03.076 Core,Thread Transfers Bandwidth Failed Miscompares 00:09:03.076 ------------------------------------------------------------------------------------ 00:09:03.076 0,0 3872/s 159 MiB/s 0 0 00:09:03.076 ==================================================================================== 00:09:03.076 Total 3872/s 410 MiB/s 0 0' 00:09:03.076 12:03:15 -- accel/accel.sh@20 -- # IFS=: 00:09:03.076 12:03:15 -- accel/accel.sh@20 -- # read -r var val 00:09:03.076 12:03:15 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:09:03.076 12:03:15 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:09:03.076 12:03:15 -- accel/accel.sh@12 -- # build_accel_config 00:09:03.076 12:03:15 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:09:03.076 12:03:15 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:03.076 12:03:15 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:03.076 12:03:15 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:09:03.076 12:03:15 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:09:03.076 12:03:15 -- accel/accel.sh@41 -- # local IFS=, 00:09:03.076 12:03:15 -- accel/accel.sh@42 -- # jq -r . 00:09:03.076 [2024-06-11 12:03:15.931547] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:09:03.076 [2024-06-11 12:03:15.931634] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2688403 ] 00:09:03.076 EAL: No free 2048 kB hugepages reported on node 1 00:09:03.076 [2024-06-11 12:03:16.051462] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:03.076 [2024-06-11 12:03:16.099334] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:03.335 12:03:16 -- accel/accel.sh@21 -- # val= 00:09:03.335 12:03:16 -- accel/accel.sh@22 -- # case "$var" in 00:09:03.335 12:03:16 -- accel/accel.sh@20 -- # IFS=: 00:09:03.335 12:03:16 -- accel/accel.sh@20 -- # read -r var val 00:09:03.335 12:03:16 -- accel/accel.sh@21 -- # val= 00:09:03.335 12:03:16 -- accel/accel.sh@22 -- # case "$var" in 00:09:03.335 12:03:16 -- accel/accel.sh@20 -- # IFS=: 00:09:03.335 12:03:16 -- accel/accel.sh@20 -- # read -r var val 00:09:03.335 12:03:16 -- accel/accel.sh@21 -- # val= 00:09:03.335 12:03:16 -- accel/accel.sh@22 -- # case "$var" in 00:09:03.335 12:03:16 -- accel/accel.sh@20 -- # IFS=: 00:09:03.335 12:03:16 -- accel/accel.sh@20 -- # read -r var val 00:09:03.335 12:03:16 -- accel/accel.sh@21 -- # val=0x1 00:09:03.335 12:03:16 -- accel/accel.sh@22 -- # case "$var" in 00:09:03.335 12:03:16 -- accel/accel.sh@20 -- # IFS=: 00:09:03.335 12:03:16 -- accel/accel.sh@20 -- # read -r var val 00:09:03.335 12:03:16 -- accel/accel.sh@21 -- # val= 00:09:03.335 12:03:16 -- accel/accel.sh@22 -- # case "$var" in 00:09:03.335 12:03:16 -- accel/accel.sh@20 -- # IFS=: 00:09:03.335 12:03:16 -- accel/accel.sh@20 -- # read -r var val 00:09:03.335 12:03:16 -- accel/accel.sh@21 -- # val= 00:09:03.335 12:03:16 -- accel/accel.sh@22 -- # case "$var" in 00:09:03.335 12:03:16 -- accel/accel.sh@20 -- # IFS=: 00:09:03.335 12:03:16 -- accel/accel.sh@20 -- # read -r var val 00:09:03.335 12:03:16 -- accel/accel.sh@21 -- # val=decompress 00:09:03.335 12:03:16 -- accel/accel.sh@22 -- # case 
"$var" in 00:09:03.335 12:03:16 -- accel/accel.sh@24 -- # accel_opc=decompress 00:09:03.335 12:03:16 -- accel/accel.sh@20 -- # IFS=: 00:09:03.335 12:03:16 -- accel/accel.sh@20 -- # read -r var val 00:09:03.335 12:03:16 -- accel/accel.sh@21 -- # val='111250 bytes' 00:09:03.335 12:03:16 -- accel/accel.sh@22 -- # case "$var" in 00:09:03.335 12:03:16 -- accel/accel.sh@20 -- # IFS=: 00:09:03.335 12:03:16 -- accel/accel.sh@20 -- # read -r var val 00:09:03.335 12:03:16 -- accel/accel.sh@21 -- # val= 00:09:03.335 12:03:16 -- accel/accel.sh@22 -- # case "$var" in 00:09:03.335 12:03:16 -- accel/accel.sh@20 -- # IFS=: 00:09:03.335 12:03:16 -- accel/accel.sh@20 -- # read -r var val 00:09:03.335 12:03:16 -- accel/accel.sh@21 -- # val=software 00:09:03.335 12:03:16 -- accel/accel.sh@22 -- # case "$var" in 00:09:03.335 12:03:16 -- accel/accel.sh@23 -- # accel_module=software 00:09:03.335 12:03:16 -- accel/accel.sh@20 -- # IFS=: 00:09:03.335 12:03:16 -- accel/accel.sh@20 -- # read -r var val 00:09:03.335 12:03:16 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:09:03.335 12:03:16 -- accel/accel.sh@22 -- # case "$var" in 00:09:03.335 12:03:16 -- accel/accel.sh@20 -- # IFS=: 00:09:03.335 12:03:16 -- accel/accel.sh@20 -- # read -r var val 00:09:03.335 12:03:16 -- accel/accel.sh@21 -- # val=32 00:09:03.335 12:03:16 -- accel/accel.sh@22 -- # case "$var" in 00:09:03.335 12:03:16 -- accel/accel.sh@20 -- # IFS=: 00:09:03.335 12:03:16 -- accel/accel.sh@20 -- # read -r var val 00:09:03.335 12:03:16 -- accel/accel.sh@21 -- # val=32 00:09:03.335 12:03:16 -- accel/accel.sh@22 -- # case "$var" in 00:09:03.335 12:03:16 -- accel/accel.sh@20 -- # IFS=: 00:09:03.335 12:03:16 -- accel/accel.sh@20 -- # read -r var val 00:09:03.335 12:03:16 -- accel/accel.sh@21 -- # val=1 00:09:03.335 12:03:16 -- accel/accel.sh@22 -- # case "$var" in 00:09:03.335 12:03:16 -- accel/accel.sh@20 -- # IFS=: 00:09:03.335 12:03:16 -- accel/accel.sh@20 -- # read -r var val 00:09:03.335 12:03:16 -- accel/accel.sh@21 -- # val='1 seconds' 00:09:03.335 12:03:16 -- accel/accel.sh@22 -- # case "$var" in 00:09:03.335 12:03:16 -- accel/accel.sh@20 -- # IFS=: 00:09:03.335 12:03:16 -- accel/accel.sh@20 -- # read -r var val 00:09:03.335 12:03:16 -- accel/accel.sh@21 -- # val=Yes 00:09:03.335 12:03:16 -- accel/accel.sh@22 -- # case "$var" in 00:09:03.335 12:03:16 -- accel/accel.sh@20 -- # IFS=: 00:09:03.335 12:03:16 -- accel/accel.sh@20 -- # read -r var val 00:09:03.335 12:03:16 -- accel/accel.sh@21 -- # val= 00:09:03.335 12:03:16 -- accel/accel.sh@22 -- # case "$var" in 00:09:03.335 12:03:16 -- accel/accel.sh@20 -- # IFS=: 00:09:03.335 12:03:16 -- accel/accel.sh@20 -- # read -r var val 00:09:03.335 12:03:16 -- accel/accel.sh@21 -- # val= 00:09:03.335 12:03:16 -- accel/accel.sh@22 -- # case "$var" in 00:09:03.335 12:03:16 -- accel/accel.sh@20 -- # IFS=: 00:09:03.335 12:03:16 -- accel/accel.sh@20 -- # read -r var val 00:09:04.709 12:03:17 -- accel/accel.sh@21 -- # val= 00:09:04.709 12:03:17 -- accel/accel.sh@22 -- # case "$var" in 00:09:04.709 12:03:17 -- accel/accel.sh@20 -- # IFS=: 00:09:04.709 12:03:17 -- accel/accel.sh@20 -- # read -r var val 00:09:04.709 12:03:17 -- accel/accel.sh@21 -- # val= 00:09:04.709 12:03:17 -- accel/accel.sh@22 -- # case "$var" in 00:09:04.709 12:03:17 -- accel/accel.sh@20 -- # IFS=: 00:09:04.709 12:03:17 -- accel/accel.sh@20 -- # read -r var val 00:09:04.709 12:03:17 -- accel/accel.sh@21 -- # val= 00:09:04.709 12:03:17 -- accel/accel.sh@22 -- # case "$var" in 00:09:04.709 12:03:17 
-- accel/accel.sh@20 -- # IFS=: 00:09:04.709 12:03:17 -- accel/accel.sh@20 -- # read -r var val 00:09:04.709 12:03:17 -- accel/accel.sh@21 -- # val= 00:09:04.709 12:03:17 -- accel/accel.sh@22 -- # case "$var" in 00:09:04.709 12:03:17 -- accel/accel.sh@20 -- # IFS=: 00:09:04.709 12:03:17 -- accel/accel.sh@20 -- # read -r var val 00:09:04.709 12:03:17 -- accel/accel.sh@21 -- # val= 00:09:04.709 12:03:17 -- accel/accel.sh@22 -- # case "$var" in 00:09:04.709 12:03:17 -- accel/accel.sh@20 -- # IFS=: 00:09:04.709 12:03:17 -- accel/accel.sh@20 -- # read -r var val 00:09:04.709 12:03:17 -- accel/accel.sh@21 -- # val= 00:09:04.709 12:03:17 -- accel/accel.sh@22 -- # case "$var" in 00:09:04.709 12:03:17 -- accel/accel.sh@20 -- # IFS=: 00:09:04.709 12:03:17 -- accel/accel.sh@20 -- # read -r var val 00:09:04.709 12:03:17 -- accel/accel.sh@28 -- # [[ -n software ]] 00:09:04.709 12:03:17 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:09:04.709 12:03:17 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:04.709 00:09:04.709 real 0m2.813s 00:09:04.709 user 0m2.449s 00:09:04.709 sys 0m0.369s 00:09:04.709 12:03:17 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:04.709 12:03:17 -- common/autotest_common.sh@10 -- # set +x 00:09:04.709 ************************************ 00:09:04.709 END TEST accel_decmop_full 00:09:04.709 ************************************ 00:09:04.709 12:03:17 -- accel/accel.sh@111 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:09:04.709 12:03:17 -- common/autotest_common.sh@1077 -- # '[' 11 -le 1 ']' 00:09:04.709 12:03:17 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:09:04.709 12:03:17 -- common/autotest_common.sh@10 -- # set +x 00:09:04.709 ************************************ 00:09:04.709 START TEST accel_decomp_mcore 00:09:04.709 ************************************ 00:09:04.709 12:03:17 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:09:04.709 12:03:17 -- accel/accel.sh@16 -- # local accel_opc 00:09:04.709 12:03:17 -- accel/accel.sh@17 -- # local accel_module 00:09:04.709 12:03:17 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:09:04.709 12:03:17 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:09:04.709 12:03:17 -- accel/accel.sh@12 -- # build_accel_config 00:09:04.709 12:03:17 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:09:04.709 12:03:17 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:04.709 12:03:17 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:04.709 12:03:17 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:09:04.709 12:03:17 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:09:04.709 12:03:17 -- accel/accel.sh@41 -- # local IFS=, 00:09:04.709 12:03:17 -- accel/accel.sh@42 -- # jq -r . 00:09:04.709 [2024-06-11 12:03:17.386823] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 
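
The accel_decmop_full test that ends above differs from plain accel_decomp only by the extra '-o 0' on the accel_perf command line; its configuration banner then reports 'Transfer size: 111250 bytes' instead of 4096, i.e. the run appears to decompress the bib file's full chunk per operation rather than 4 KiB slices. Throughput drops to 3872 transfers/s, but at the larger size that is still the 410 MiB/s shown in the Total row. A sketch under the same placeholder-path assumption as before:

  # Full-buffer decompress with verification; per the banner above, the
  # '-o 0' run ends up using 111250-byte transfers.
  "$SPDK_ROOT/build/examples/accel_perf" -t 1 -w decompress \
      -l "$SPDK_ROOT/test/accel/bib" -y -o 0
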
00:09:04.709 [2024-06-11 12:03:17.386912] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2688602 ] 00:09:04.709 EAL: No free 2048 kB hugepages reported on node 1 00:09:04.709 [2024-06-11 12:03:17.507343] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:09:04.709 [2024-06-11 12:03:17.558357] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:09:04.709 [2024-06-11 12:03:17.558459] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:09:04.709 [2024-06-11 12:03:17.558559] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:09:04.709 [2024-06-11 12:03:17.558567] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:06.086 12:03:18 -- accel/accel.sh@18 -- # out='Preparing input file... 00:09:06.086 00:09:06.086 SPDK Configuration: 00:09:06.086 Core mask: 0xf 00:09:06.086 00:09:06.086 Accel Perf Configuration: 00:09:06.086 Workload Type: decompress 00:09:06.086 Transfer size: 4096 bytes 00:09:06.086 Vector count 1 00:09:06.086 Module: software 00:09:06.086 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:09:06.086 Queue depth: 32 00:09:06.086 Allocate depth: 32 00:09:06.086 # threads/core: 1 00:09:06.086 Run time: 1 seconds 00:09:06.086 Verify: Yes 00:09:06.086 00:09:06.086 Running for 1 seconds... 00:09:06.086 00:09:06.086 Core,Thread Transfers Bandwidth Failed Miscompares 00:09:06.087 ------------------------------------------------------------------------------------ 00:09:06.087 0,0 54336/s 100 MiB/s 0 0 00:09:06.087 3,0 54592/s 100 MiB/s 0 0 00:09:06.087 2,0 76288/s 140 MiB/s 0 0 00:09:06.087 1,0 54720/s 100 MiB/s 0 0 00:09:06.087 ==================================================================================== 00:09:06.087 Total 239936/s 937 MiB/s 0 0' 00:09:06.087 12:03:18 -- accel/accel.sh@20 -- # IFS=: 00:09:06.087 12:03:18 -- accel/accel.sh@20 -- # read -r var val 00:09:06.087 12:03:18 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:09:06.087 12:03:18 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:09:06.087 12:03:18 -- accel/accel.sh@12 -- # build_accel_config 00:09:06.087 12:03:18 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:09:06.087 12:03:18 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:06.087 12:03:18 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:06.087 12:03:18 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:09:06.087 12:03:18 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:09:06.087 12:03:18 -- accel/accel.sh@41 -- # local IFS=, 00:09:06.087 12:03:18 -- accel/accel.sh@42 -- # jq -r . 00:09:06.087 [2024-06-11 12:03:18.788998] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 
00:09:06.087 [2024-06-11 12:03:18.789086] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2688785 ] 00:09:06.087 EAL: No free 2048 kB hugepages reported on node 1 00:09:06.087 [2024-06-11 12:03:18.908539] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:09:06.087 [2024-06-11 12:03:18.959324] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:09:06.087 [2024-06-11 12:03:18.959418] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:09:06.087 [2024-06-11 12:03:18.959470] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:06.087 [2024-06-11 12:03:18.959469] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:09:06.087 12:03:19 -- accel/accel.sh@21 -- # val= 00:09:06.087 12:03:19 -- accel/accel.sh@22 -- # case "$var" in 00:09:06.087 12:03:19 -- accel/accel.sh@20 -- # IFS=: 00:09:06.087 12:03:19 -- accel/accel.sh@20 -- # read -r var val 00:09:06.087 12:03:19 -- accel/accel.sh@21 -- # val= 00:09:06.087 12:03:19 -- accel/accel.sh@22 -- # case "$var" in 00:09:06.087 12:03:19 -- accel/accel.sh@20 -- # IFS=: 00:09:06.087 12:03:19 -- accel/accel.sh@20 -- # read -r var val 00:09:06.087 12:03:19 -- accel/accel.sh@21 -- # val= 00:09:06.087 12:03:19 -- accel/accel.sh@22 -- # case "$var" in 00:09:06.087 12:03:19 -- accel/accel.sh@20 -- # IFS=: 00:09:06.087 12:03:19 -- accel/accel.sh@20 -- # read -r var val 00:09:06.087 12:03:19 -- accel/accel.sh@21 -- # val=0xf 00:09:06.087 12:03:19 -- accel/accel.sh@22 -- # case "$var" in 00:09:06.087 12:03:19 -- accel/accel.sh@20 -- # IFS=: 00:09:06.087 12:03:19 -- accel/accel.sh@20 -- # read -r var val 00:09:06.087 12:03:19 -- accel/accel.sh@21 -- # val= 00:09:06.087 12:03:19 -- accel/accel.sh@22 -- # case "$var" in 00:09:06.087 12:03:19 -- accel/accel.sh@20 -- # IFS=: 00:09:06.087 12:03:19 -- accel/accel.sh@20 -- # read -r var val 00:09:06.087 12:03:19 -- accel/accel.sh@21 -- # val= 00:09:06.087 12:03:19 -- accel/accel.sh@22 -- # case "$var" in 00:09:06.087 12:03:19 -- accel/accel.sh@20 -- # IFS=: 00:09:06.087 12:03:19 -- accel/accel.sh@20 -- # read -r var val 00:09:06.087 12:03:19 -- accel/accel.sh@21 -- # val=decompress 00:09:06.087 12:03:19 -- accel/accel.sh@22 -- # case "$var" in 00:09:06.087 12:03:19 -- accel/accel.sh@24 -- # accel_opc=decompress 00:09:06.087 12:03:19 -- accel/accel.sh@20 -- # IFS=: 00:09:06.087 12:03:19 -- accel/accel.sh@20 -- # read -r var val 00:09:06.087 12:03:19 -- accel/accel.sh@21 -- # val='4096 bytes' 00:09:06.087 12:03:19 -- accel/accel.sh@22 -- # case "$var" in 00:09:06.087 12:03:19 -- accel/accel.sh@20 -- # IFS=: 00:09:06.087 12:03:19 -- accel/accel.sh@20 -- # read -r var val 00:09:06.087 12:03:19 -- accel/accel.sh@21 -- # val= 00:09:06.087 12:03:19 -- accel/accel.sh@22 -- # case "$var" in 00:09:06.087 12:03:19 -- accel/accel.sh@20 -- # IFS=: 00:09:06.087 12:03:19 -- accel/accel.sh@20 -- # read -r var val 00:09:06.087 12:03:19 -- accel/accel.sh@21 -- # val=software 00:09:06.087 12:03:19 -- accel/accel.sh@22 -- # case "$var" in 00:09:06.087 12:03:19 -- accel/accel.sh@23 -- # accel_module=software 00:09:06.087 12:03:19 -- accel/accel.sh@20 -- # IFS=: 00:09:06.087 12:03:19 -- accel/accel.sh@20 -- # read -r var val 00:09:06.087 12:03:19 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:09:06.087 12:03:19 -- accel/accel.sh@22 -- # case 
"$var" in 00:09:06.087 12:03:19 -- accel/accel.sh@20 -- # IFS=: 00:09:06.087 12:03:19 -- accel/accel.sh@20 -- # read -r var val 00:09:06.087 12:03:19 -- accel/accel.sh@21 -- # val=32 00:09:06.087 12:03:19 -- accel/accel.sh@22 -- # case "$var" in 00:09:06.087 12:03:19 -- accel/accel.sh@20 -- # IFS=: 00:09:06.087 12:03:19 -- accel/accel.sh@20 -- # read -r var val 00:09:06.087 12:03:19 -- accel/accel.sh@21 -- # val=32 00:09:06.087 12:03:19 -- accel/accel.sh@22 -- # case "$var" in 00:09:06.087 12:03:19 -- accel/accel.sh@20 -- # IFS=: 00:09:06.087 12:03:19 -- accel/accel.sh@20 -- # read -r var val 00:09:06.087 12:03:19 -- accel/accel.sh@21 -- # val=1 00:09:06.087 12:03:19 -- accel/accel.sh@22 -- # case "$var" in 00:09:06.087 12:03:19 -- accel/accel.sh@20 -- # IFS=: 00:09:06.087 12:03:19 -- accel/accel.sh@20 -- # read -r var val 00:09:06.087 12:03:19 -- accel/accel.sh@21 -- # val='1 seconds' 00:09:06.087 12:03:19 -- accel/accel.sh@22 -- # case "$var" in 00:09:06.087 12:03:19 -- accel/accel.sh@20 -- # IFS=: 00:09:06.087 12:03:19 -- accel/accel.sh@20 -- # read -r var val 00:09:06.087 12:03:19 -- accel/accel.sh@21 -- # val=Yes 00:09:06.087 12:03:19 -- accel/accel.sh@22 -- # case "$var" in 00:09:06.087 12:03:19 -- accel/accel.sh@20 -- # IFS=: 00:09:06.087 12:03:19 -- accel/accel.sh@20 -- # read -r var val 00:09:06.087 12:03:19 -- accel/accel.sh@21 -- # val= 00:09:06.087 12:03:19 -- accel/accel.sh@22 -- # case "$var" in 00:09:06.087 12:03:19 -- accel/accel.sh@20 -- # IFS=: 00:09:06.087 12:03:19 -- accel/accel.sh@20 -- # read -r var val 00:09:06.087 12:03:19 -- accel/accel.sh@21 -- # val= 00:09:06.087 12:03:19 -- accel/accel.sh@22 -- # case "$var" in 00:09:06.087 12:03:19 -- accel/accel.sh@20 -- # IFS=: 00:09:06.087 12:03:19 -- accel/accel.sh@20 -- # read -r var val 00:09:07.460 12:03:20 -- accel/accel.sh@21 -- # val= 00:09:07.460 12:03:20 -- accel/accel.sh@22 -- # case "$var" in 00:09:07.460 12:03:20 -- accel/accel.sh@20 -- # IFS=: 00:09:07.460 12:03:20 -- accel/accel.sh@20 -- # read -r var val 00:09:07.460 12:03:20 -- accel/accel.sh@21 -- # val= 00:09:07.460 12:03:20 -- accel/accel.sh@22 -- # case "$var" in 00:09:07.460 12:03:20 -- accel/accel.sh@20 -- # IFS=: 00:09:07.460 12:03:20 -- accel/accel.sh@20 -- # read -r var val 00:09:07.460 12:03:20 -- accel/accel.sh@21 -- # val= 00:09:07.460 12:03:20 -- accel/accel.sh@22 -- # case "$var" in 00:09:07.460 12:03:20 -- accel/accel.sh@20 -- # IFS=: 00:09:07.460 12:03:20 -- accel/accel.sh@20 -- # read -r var val 00:09:07.460 12:03:20 -- accel/accel.sh@21 -- # val= 00:09:07.460 12:03:20 -- accel/accel.sh@22 -- # case "$var" in 00:09:07.460 12:03:20 -- accel/accel.sh@20 -- # IFS=: 00:09:07.460 12:03:20 -- accel/accel.sh@20 -- # read -r var val 00:09:07.460 12:03:20 -- accel/accel.sh@21 -- # val= 00:09:07.460 12:03:20 -- accel/accel.sh@22 -- # case "$var" in 00:09:07.460 12:03:20 -- accel/accel.sh@20 -- # IFS=: 00:09:07.460 12:03:20 -- accel/accel.sh@20 -- # read -r var val 00:09:07.460 12:03:20 -- accel/accel.sh@21 -- # val= 00:09:07.460 12:03:20 -- accel/accel.sh@22 -- # case "$var" in 00:09:07.460 12:03:20 -- accel/accel.sh@20 -- # IFS=: 00:09:07.460 12:03:20 -- accel/accel.sh@20 -- # read -r var val 00:09:07.460 12:03:20 -- accel/accel.sh@21 -- # val= 00:09:07.460 12:03:20 -- accel/accel.sh@22 -- # case "$var" in 00:09:07.460 12:03:20 -- accel/accel.sh@20 -- # IFS=: 00:09:07.460 12:03:20 -- accel/accel.sh@20 -- # read -r var val 00:09:07.460 12:03:20 -- accel/accel.sh@21 -- # val= 00:09:07.460 12:03:20 -- accel/accel.sh@22 -- # case "$var" in 00:09:07.460 
12:03:20 -- accel/accel.sh@20 -- # IFS=: 00:09:07.460 12:03:20 -- accel/accel.sh@20 -- # read -r var val 00:09:07.460 12:03:20 -- accel/accel.sh@21 -- # val= 00:09:07.460 12:03:20 -- accel/accel.sh@22 -- # case "$var" in 00:09:07.460 12:03:20 -- accel/accel.sh@20 -- # IFS=: 00:09:07.460 12:03:20 -- accel/accel.sh@20 -- # read -r var val 00:09:07.460 12:03:20 -- accel/accel.sh@28 -- # [[ -n software ]] 00:09:07.460 12:03:20 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:09:07.460 12:03:20 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:07.460 00:09:07.460 real 0m2.801s 00:09:07.460 user 0m9.202s 00:09:07.460 sys 0m0.381s 00:09:07.460 12:03:20 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:07.460 12:03:20 -- common/autotest_common.sh@10 -- # set +x 00:09:07.460 ************************************ 00:09:07.460 END TEST accel_decomp_mcore 00:09:07.460 ************************************ 00:09:07.460 12:03:20 -- accel/accel.sh@112 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:09:07.460 12:03:20 -- common/autotest_common.sh@1077 -- # '[' 13 -le 1 ']' 00:09:07.460 12:03:20 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:09:07.460 12:03:20 -- common/autotest_common.sh@10 -- # set +x 00:09:07.460 ************************************ 00:09:07.460 START TEST accel_decomp_full_mcore 00:09:07.460 ************************************ 00:09:07.460 12:03:20 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:09:07.460 12:03:20 -- accel/accel.sh@16 -- # local accel_opc 00:09:07.460 12:03:20 -- accel/accel.sh@17 -- # local accel_module 00:09:07.460 12:03:20 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:09:07.460 12:03:20 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:09:07.460 12:03:20 -- accel/accel.sh@12 -- # build_accel_config 00:09:07.460 12:03:20 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:09:07.460 12:03:20 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:07.460 12:03:20 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:07.460 12:03:20 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:09:07.460 12:03:20 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:09:07.460 12:03:20 -- accel/accel.sh@41 -- # local IFS=, 00:09:07.460 12:03:20 -- accel/accel.sh@42 -- # jq -r . 00:09:07.460 [2024-06-11 12:03:20.230529] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 
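
accel_decomp_mcore, which completed above, adds '-m 0xf' (core mask, bits 0-3 set): EAL reports four available cores, a reactor starts on each, and the results table carries one row per core. The rows sum to the Total, and the 'user 0m9.202s' against 'real 0m2.801s' in the teardown reflects the same four-way parallelism — SPDK reactors busy-poll, so CPU time accumulates on all four cores at once. Sketch:

  # The 4 KiB decompress run fanned out across cores 0-3.
  "$SPDK_ROOT/build/examples/accel_perf" -t 1 -w decompress \
      -l "$SPDK_ROOT/test/accel/bib" -y -m 0xf
  echo $(( 54336 + 54592 + 76288 + 54720 ))   # prints 239936, the Total row above
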
00:09:07.460 [2024-06-11 12:03:20.230620] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2688984 ] 00:09:07.460 EAL: No free 2048 kB hugepages reported on node 1 00:09:07.460 [2024-06-11 12:03:20.350400] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:09:07.460 [2024-06-11 12:03:20.402315] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:09:07.460 [2024-06-11 12:03:20.402400] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:09:07.460 [2024-06-11 12:03:20.402488] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:09:07.460 [2024-06-11 12:03:20.402489] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:08.836 12:03:21 -- accel/accel.sh@18 -- # out='Preparing input file... 00:09:08.836 00:09:08.836 SPDK Configuration: 00:09:08.836 Core mask: 0xf 00:09:08.836 00:09:08.836 Accel Perf Configuration: 00:09:08.836 Workload Type: decompress 00:09:08.836 Transfer size: 111250 bytes 00:09:08.836 Vector count 1 00:09:08.836 Module: software 00:09:08.836 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:09:08.836 Queue depth: 32 00:09:08.836 Allocate depth: 32 00:09:08.836 # threads/core: 1 00:09:08.836 Run time: 1 seconds 00:09:08.836 Verify: Yes 00:09:08.836 00:09:08.836 Running for 1 seconds... 00:09:08.836 00:09:08.836 Core,Thread Transfers Bandwidth Failed Miscompares 00:09:08.836 ------------------------------------------------------------------------------------ 00:09:08.836 0,0 3840/s 158 MiB/s 0 0 00:09:08.836 3,0 3840/s 158 MiB/s 0 0 00:09:08.836 2,0 5632/s 232 MiB/s 0 0 00:09:08.836 1,0 3840/s 158 MiB/s 0 0 00:09:08.836 ==================================================================================== 00:09:08.836 Total 17152/s 1819 MiB/s 0 0' 00:09:08.836 12:03:21 -- accel/accel.sh@20 -- # IFS=: 00:09:08.836 12:03:21 -- accel/accel.sh@20 -- # read -r var val 00:09:08.836 12:03:21 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:09:08.836 12:03:21 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:09:08.836 12:03:21 -- accel/accel.sh@12 -- # build_accel_config 00:09:08.836 12:03:21 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:09:08.836 12:03:21 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:08.836 12:03:21 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:08.836 12:03:21 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:09:08.836 12:03:21 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:09:08.836 12:03:21 -- accel/accel.sh@41 -- # local IFS=, 00:09:08.836 12:03:21 -- accel/accel.sh@42 -- # jq -r . 00:09:08.836 [2024-06-11 12:03:21.644068] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 
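
accel_decomp_full_mcore combines the two previous variants: 111250-byte transfers ('-o 0') across all four cores ('-m 0xf'). The table above still adds up, and core 2 is again the fast outlier (5632/s here, 76288/s in the 4 KiB multicore run):

  # Per-core rows vs. the Total row of the full-buffer multicore run:
  echo $(( 3840 + 3840 + 5632 + 3840 ))      # prints 17152 transfers/s
  echo $(( 17152 * 111250 / 1024 / 1024 ))   # prints 1819 (MiB/s), as reported
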
00:09:08.836 [2024-06-11 12:03:21.644156] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2689173 ] 00:09:08.836 EAL: No free 2048 kB hugepages reported on node 1 00:09:08.836 [2024-06-11 12:03:21.763317] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:09:08.836 [2024-06-11 12:03:21.814092] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:09:08.836 [2024-06-11 12:03:21.814178] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:09:08.836 [2024-06-11 12:03:21.814282] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:08.836 [2024-06-11 12:03:21.814282] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:09:09.095 12:03:21 -- accel/accel.sh@21 -- # val= 00:09:09.095 12:03:21 -- accel/accel.sh@22 -- # case "$var" in 00:09:09.095 12:03:21 -- accel/accel.sh@20 -- # IFS=: 00:09:09.095 12:03:21 -- accel/accel.sh@20 -- # read -r var val 00:09:09.095 12:03:21 -- accel/accel.sh@21 -- # val= 00:09:09.095 12:03:21 -- accel/accel.sh@22 -- # case "$var" in 00:09:09.095 12:03:21 -- accel/accel.sh@20 -- # IFS=: 00:09:09.095 12:03:21 -- accel/accel.sh@20 -- # read -r var val 00:09:09.095 12:03:21 -- accel/accel.sh@21 -- # val= 00:09:09.095 12:03:21 -- accel/accel.sh@22 -- # case "$var" in 00:09:09.095 12:03:21 -- accel/accel.sh@20 -- # IFS=: 00:09:09.095 12:03:21 -- accel/accel.sh@20 -- # read -r var val 00:09:09.095 12:03:21 -- accel/accel.sh@21 -- # val=0xf 00:09:09.095 12:03:21 -- accel/accel.sh@22 -- # case "$var" in 00:09:09.095 12:03:21 -- accel/accel.sh@20 -- # IFS=: 00:09:09.095 12:03:21 -- accel/accel.sh@20 -- # read -r var val 00:09:09.095 12:03:21 -- accel/accel.sh@21 -- # val= 00:09:09.095 12:03:21 -- accel/accel.sh@22 -- # case "$var" in 00:09:09.095 12:03:21 -- accel/accel.sh@20 -- # IFS=: 00:09:09.095 12:03:21 -- accel/accel.sh@20 -- # read -r var val 00:09:09.095 12:03:21 -- accel/accel.sh@21 -- # val= 00:09:09.095 12:03:21 -- accel/accel.sh@22 -- # case "$var" in 00:09:09.095 12:03:21 -- accel/accel.sh@20 -- # IFS=: 00:09:09.095 12:03:21 -- accel/accel.sh@20 -- # read -r var val 00:09:09.095 12:03:21 -- accel/accel.sh@21 -- # val=decompress 00:09:09.095 12:03:21 -- accel/accel.sh@22 -- # case "$var" in 00:09:09.095 12:03:21 -- accel/accel.sh@24 -- # accel_opc=decompress 00:09:09.095 12:03:21 -- accel/accel.sh@20 -- # IFS=: 00:09:09.095 12:03:21 -- accel/accel.sh@20 -- # read -r var val 00:09:09.095 12:03:21 -- accel/accel.sh@21 -- # val='111250 bytes' 00:09:09.095 12:03:21 -- accel/accel.sh@22 -- # case "$var" in 00:09:09.095 12:03:21 -- accel/accel.sh@20 -- # IFS=: 00:09:09.095 12:03:21 -- accel/accel.sh@20 -- # read -r var val 00:09:09.095 12:03:21 -- accel/accel.sh@21 -- # val= 00:09:09.095 12:03:21 -- accel/accel.sh@22 -- # case "$var" in 00:09:09.095 12:03:21 -- accel/accel.sh@20 -- # IFS=: 00:09:09.095 12:03:21 -- accel/accel.sh@20 -- # read -r var val 00:09:09.095 12:03:21 -- accel/accel.sh@21 -- # val=software 00:09:09.095 12:03:21 -- accel/accel.sh@22 -- # case "$var" in 00:09:09.095 12:03:21 -- accel/accel.sh@23 -- # accel_module=software 00:09:09.095 12:03:21 -- accel/accel.sh@20 -- # IFS=: 00:09:09.095 12:03:21 -- accel/accel.sh@20 -- # read -r var val 00:09:09.095 12:03:21 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:09:09.095 12:03:21 -- accel/accel.sh@22 -- # case 
"$var" in 00:09:09.095 12:03:21 -- accel/accel.sh@20 -- # IFS=: 00:09:09.095 12:03:21 -- accel/accel.sh@20 -- # read -r var val 00:09:09.095 12:03:21 -- accel/accel.sh@21 -- # val=32 00:09:09.095 12:03:21 -- accel/accel.sh@22 -- # case "$var" in 00:09:09.095 12:03:21 -- accel/accel.sh@20 -- # IFS=: 00:09:09.095 12:03:21 -- accel/accel.sh@20 -- # read -r var val 00:09:09.095 12:03:21 -- accel/accel.sh@21 -- # val=32 00:09:09.095 12:03:21 -- accel/accel.sh@22 -- # case "$var" in 00:09:09.095 12:03:21 -- accel/accel.sh@20 -- # IFS=: 00:09:09.095 12:03:21 -- accel/accel.sh@20 -- # read -r var val 00:09:09.095 12:03:21 -- accel/accel.sh@21 -- # val=1 00:09:09.096 12:03:21 -- accel/accel.sh@22 -- # case "$var" in 00:09:09.096 12:03:21 -- accel/accel.sh@20 -- # IFS=: 00:09:09.096 12:03:21 -- accel/accel.sh@20 -- # read -r var val 00:09:09.096 12:03:21 -- accel/accel.sh@21 -- # val='1 seconds' 00:09:09.096 12:03:21 -- accel/accel.sh@22 -- # case "$var" in 00:09:09.096 12:03:21 -- accel/accel.sh@20 -- # IFS=: 00:09:09.096 12:03:21 -- accel/accel.sh@20 -- # read -r var val 00:09:09.096 12:03:21 -- accel/accel.sh@21 -- # val=Yes 00:09:09.096 12:03:21 -- accel/accel.sh@22 -- # case "$var" in 00:09:09.096 12:03:21 -- accel/accel.sh@20 -- # IFS=: 00:09:09.096 12:03:21 -- accel/accel.sh@20 -- # read -r var val 00:09:09.096 12:03:21 -- accel/accel.sh@21 -- # val= 00:09:09.096 12:03:21 -- accel/accel.sh@22 -- # case "$var" in 00:09:09.096 12:03:21 -- accel/accel.sh@20 -- # IFS=: 00:09:09.096 12:03:21 -- accel/accel.sh@20 -- # read -r var val 00:09:09.096 12:03:21 -- accel/accel.sh@21 -- # val= 00:09:09.096 12:03:21 -- accel/accel.sh@22 -- # case "$var" in 00:09:09.096 12:03:21 -- accel/accel.sh@20 -- # IFS=: 00:09:09.096 12:03:21 -- accel/accel.sh@20 -- # read -r var val 00:09:10.031 12:03:23 -- accel/accel.sh@21 -- # val= 00:09:10.031 12:03:23 -- accel/accel.sh@22 -- # case "$var" in 00:09:10.031 12:03:23 -- accel/accel.sh@20 -- # IFS=: 00:09:10.031 12:03:23 -- accel/accel.sh@20 -- # read -r var val 00:09:10.031 12:03:23 -- accel/accel.sh@21 -- # val= 00:09:10.031 12:03:23 -- accel/accel.sh@22 -- # case "$var" in 00:09:10.031 12:03:23 -- accel/accel.sh@20 -- # IFS=: 00:09:10.031 12:03:23 -- accel/accel.sh@20 -- # read -r var val 00:09:10.031 12:03:23 -- accel/accel.sh@21 -- # val= 00:09:10.031 12:03:23 -- accel/accel.sh@22 -- # case "$var" in 00:09:10.031 12:03:23 -- accel/accel.sh@20 -- # IFS=: 00:09:10.031 12:03:23 -- accel/accel.sh@20 -- # read -r var val 00:09:10.031 12:03:23 -- accel/accel.sh@21 -- # val= 00:09:10.031 12:03:23 -- accel/accel.sh@22 -- # case "$var" in 00:09:10.031 12:03:23 -- accel/accel.sh@20 -- # IFS=: 00:09:10.031 12:03:23 -- accel/accel.sh@20 -- # read -r var val 00:09:10.031 12:03:23 -- accel/accel.sh@21 -- # val= 00:09:10.031 12:03:23 -- accel/accel.sh@22 -- # case "$var" in 00:09:10.031 12:03:23 -- accel/accel.sh@20 -- # IFS=: 00:09:10.031 12:03:23 -- accel/accel.sh@20 -- # read -r var val 00:09:10.031 12:03:23 -- accel/accel.sh@21 -- # val= 00:09:10.031 12:03:23 -- accel/accel.sh@22 -- # case "$var" in 00:09:10.031 12:03:23 -- accel/accel.sh@20 -- # IFS=: 00:09:10.031 12:03:23 -- accel/accel.sh@20 -- # read -r var val 00:09:10.031 12:03:23 -- accel/accel.sh@21 -- # val= 00:09:10.031 12:03:23 -- accel/accel.sh@22 -- # case "$var" in 00:09:10.031 12:03:23 -- accel/accel.sh@20 -- # IFS=: 00:09:10.031 12:03:23 -- accel/accel.sh@20 -- # read -r var val 00:09:10.031 12:03:23 -- accel/accel.sh@21 -- # val= 00:09:10.031 12:03:23 -- accel/accel.sh@22 -- # case "$var" in 00:09:10.031 
12:03:23 -- accel/accel.sh@20 -- # IFS=: 00:09:10.031 12:03:23 -- accel/accel.sh@20 -- # read -r var val 00:09:10.031 12:03:23 -- accel/accel.sh@21 -- # val= 00:09:10.031 12:03:23 -- accel/accel.sh@22 -- # case "$var" in 00:09:10.031 12:03:23 -- accel/accel.sh@20 -- # IFS=: 00:09:10.031 12:03:23 -- accel/accel.sh@20 -- # read -r var val 00:09:10.031 12:03:23 -- accel/accel.sh@28 -- # [[ -n software ]] 00:09:10.031 12:03:23 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:09:10.031 12:03:23 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:10.031 00:09:10.031 real 0m2.836s 00:09:10.031 user 0m9.318s 00:09:10.031 sys 0m0.392s 00:09:10.031 12:03:23 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:10.031 12:03:23 -- common/autotest_common.sh@10 -- # set +x 00:09:10.031 ************************************ 00:09:10.031 END TEST accel_decomp_full_mcore 00:09:10.031 ************************************ 00:09:10.290 12:03:23 -- accel/accel.sh@113 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:09:10.290 12:03:23 -- common/autotest_common.sh@1077 -- # '[' 11 -le 1 ']' 00:09:10.290 12:03:23 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:09:10.290 12:03:23 -- common/autotest_common.sh@10 -- # set +x 00:09:10.290 ************************************ 00:09:10.290 START TEST accel_decomp_mthread 00:09:10.290 ************************************ 00:09:10.290 12:03:23 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:09:10.290 12:03:23 -- accel/accel.sh@16 -- # local accel_opc 00:09:10.290 12:03:23 -- accel/accel.sh@17 -- # local accel_module 00:09:10.290 12:03:23 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:09:10.290 12:03:23 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:09:10.290 12:03:23 -- accel/accel.sh@12 -- # build_accel_config 00:09:10.290 12:03:23 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:09:10.290 12:03:23 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:10.290 12:03:23 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:10.290 12:03:23 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:09:10.290 12:03:23 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:09:10.290 12:03:23 -- accel/accel.sh@41 -- # local IFS=, 00:09:10.290 12:03:23 -- accel/accel.sh@42 -- # jq -r . 00:09:10.290 [2024-06-11 12:03:23.116898] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:09:10.290 [2024-06-11 12:03:23.117008] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2689370 ] 00:09:10.290 EAL: No free 2048 kB hugepages reported on node 1 00:09:10.290 [2024-06-11 12:03:23.239449] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:10.290 [2024-06-11 12:03:23.291148] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:11.666 12:03:24 -- accel/accel.sh@18 -- # out='Preparing input file... 
00:09:11.666 00:09:11.666 SPDK Configuration: 00:09:11.666 Core mask: 0x1 00:09:11.666 00:09:11.666 Accel Perf Configuration: 00:09:11.666 Workload Type: decompress 00:09:11.666 Transfer size: 4096 bytes 00:09:11.666 Vector count 1 00:09:11.666 Module: software 00:09:11.666 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:09:11.666 Queue depth: 32 00:09:11.666 Allocate depth: 32 00:09:11.666 # threads/core: 2 00:09:11.666 Run time: 1 seconds 00:09:11.666 Verify: Yes 00:09:11.666 00:09:11.666 Running for 1 seconds... 00:09:11.666 00:09:11.666 Core,Thread Transfers Bandwidth Failed Miscompares 00:09:11.666 ------------------------------------------------------------------------------------ 00:09:11.666 0,1 30816/s 56 MiB/s 0 0 00:09:11.666 0,0 30688/s 56 MiB/s 0 0 00:09:11.666 ==================================================================================== 00:09:11.666 Total 61504/s 240 MiB/s 0 0' 00:09:11.666 12:03:24 -- accel/accel.sh@20 -- # IFS=: 00:09:11.666 12:03:24 -- accel/accel.sh@20 -- # read -r var val 00:09:11.666 12:03:24 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:09:11.666 12:03:24 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:09:11.666 12:03:24 -- accel/accel.sh@12 -- # build_accel_config 00:09:11.666 12:03:24 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:09:11.666 12:03:24 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:11.666 12:03:24 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:11.666 12:03:24 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:09:11.666 12:03:24 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:09:11.666 12:03:24 -- accel/accel.sh@41 -- # local IFS=, 00:09:11.666 12:03:24 -- accel/accel.sh@42 -- # jq -r . 00:09:11.666 [2024-06-11 12:03:24.514174] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 
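
accel_decomp_mthread, whose first run is shown above, keeps a single core but passes '-T 2', so accel_perf spawns two worker threads on core 0 ('# threads/core: 2' in the banner). The Core,Thread column now distinguishes 0,0 from 0,1, and the two threads split the work almost evenly: 30688/s + 30816/s = 61504/s, i.e. 240 MiB/s at 4 KiB. A sketch under the same assumptions:

  # Two decompress worker threads on one core.
  "$SPDK_ROOT/build/examples/accel_perf" -t 1 -w decompress \
      -l "$SPDK_ROOT/test/accel/bib" -y -T 2
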
00:09:11.666 [2024-06-11 12:03:24.514263] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2689559 ] 00:09:11.666 EAL: No free 2048 kB hugepages reported on node 1 00:09:11.666 [2024-06-11 12:03:24.634537] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:11.666 [2024-06-11 12:03:24.679337] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:11.925 12:03:24 -- accel/accel.sh@21 -- # val= 00:09:11.925 12:03:24 -- accel/accel.sh@22 -- # case "$var" in 00:09:11.925 12:03:24 -- accel/accel.sh@20 -- # IFS=: 00:09:11.925 12:03:24 -- accel/accel.sh@20 -- # read -r var val 00:09:11.925 12:03:24 -- accel/accel.sh@21 -- # val= 00:09:11.925 12:03:24 -- accel/accel.sh@22 -- # case "$var" in 00:09:11.925 12:03:24 -- accel/accel.sh@20 -- # IFS=: 00:09:11.925 12:03:24 -- accel/accel.sh@20 -- # read -r var val 00:09:11.925 12:03:24 -- accel/accel.sh@21 -- # val= 00:09:11.925 12:03:24 -- accel/accel.sh@22 -- # case "$var" in 00:09:11.925 12:03:24 -- accel/accel.sh@20 -- # IFS=: 00:09:11.925 12:03:24 -- accel/accel.sh@20 -- # read -r var val 00:09:11.925 12:03:24 -- accel/accel.sh@21 -- # val=0x1 00:09:11.925 12:03:24 -- accel/accel.sh@22 -- # case "$var" in 00:09:11.925 12:03:24 -- accel/accel.sh@20 -- # IFS=: 00:09:11.925 12:03:24 -- accel/accel.sh@20 -- # read -r var val 00:09:11.925 12:03:24 -- accel/accel.sh@21 -- # val= 00:09:11.925 12:03:24 -- accel/accel.sh@22 -- # case "$var" in 00:09:11.925 12:03:24 -- accel/accel.sh@20 -- # IFS=: 00:09:11.925 12:03:24 -- accel/accel.sh@20 -- # read -r var val 00:09:11.925 12:03:24 -- accel/accel.sh@21 -- # val= 00:09:11.925 12:03:24 -- accel/accel.sh@22 -- # case "$var" in 00:09:11.925 12:03:24 -- accel/accel.sh@20 -- # IFS=: 00:09:11.925 12:03:24 -- accel/accel.sh@20 -- # read -r var val 00:09:11.925 12:03:24 -- accel/accel.sh@21 -- # val=decompress 00:09:11.925 12:03:24 -- accel/accel.sh@22 -- # case "$var" in 00:09:11.925 12:03:24 -- accel/accel.sh@24 -- # accel_opc=decompress 00:09:11.925 12:03:24 -- accel/accel.sh@20 -- # IFS=: 00:09:11.925 12:03:24 -- accel/accel.sh@20 -- # read -r var val 00:09:11.925 12:03:24 -- accel/accel.sh@21 -- # val='4096 bytes' 00:09:11.925 12:03:24 -- accel/accel.sh@22 -- # case "$var" in 00:09:11.925 12:03:24 -- accel/accel.sh@20 -- # IFS=: 00:09:11.925 12:03:24 -- accel/accel.sh@20 -- # read -r var val 00:09:11.925 12:03:24 -- accel/accel.sh@21 -- # val= 00:09:11.925 12:03:24 -- accel/accel.sh@22 -- # case "$var" in 00:09:11.925 12:03:24 -- accel/accel.sh@20 -- # IFS=: 00:09:11.925 12:03:24 -- accel/accel.sh@20 -- # read -r var val 00:09:11.925 12:03:24 -- accel/accel.sh@21 -- # val=software 00:09:11.925 12:03:24 -- accel/accel.sh@22 -- # case "$var" in 00:09:11.925 12:03:24 -- accel/accel.sh@23 -- # accel_module=software 00:09:11.925 12:03:24 -- accel/accel.sh@20 -- # IFS=: 00:09:11.925 12:03:24 -- accel/accel.sh@20 -- # read -r var val 00:09:11.925 12:03:24 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:09:11.925 12:03:24 -- accel/accel.sh@22 -- # case "$var" in 00:09:11.925 12:03:24 -- accel/accel.sh@20 -- # IFS=: 00:09:11.925 12:03:24 -- accel/accel.sh@20 -- # read -r var val 00:09:11.925 12:03:24 -- accel/accel.sh@21 -- # val=32 00:09:11.925 12:03:24 -- accel/accel.sh@22 -- # case "$var" in 00:09:11.925 12:03:24 -- accel/accel.sh@20 -- # IFS=: 00:09:11.925 
12:03:24 -- accel/accel.sh@20 -- # read -r var val 00:09:11.925 12:03:24 -- accel/accel.sh@21 -- # val=32 00:09:11.925 12:03:24 -- accel/accel.sh@22 -- # case "$var" in 00:09:11.925 12:03:24 -- accel/accel.sh@20 -- # IFS=: 00:09:11.925 12:03:24 -- accel/accel.sh@20 -- # read -r var val 00:09:11.925 12:03:24 -- accel/accel.sh@21 -- # val=2 00:09:11.925 12:03:24 -- accel/accel.sh@22 -- # case "$var" in 00:09:11.925 12:03:24 -- accel/accel.sh@20 -- # IFS=: 00:09:11.925 12:03:24 -- accel/accel.sh@20 -- # read -r var val 00:09:11.925 12:03:24 -- accel/accel.sh@21 -- # val='1 seconds' 00:09:11.925 12:03:24 -- accel/accel.sh@22 -- # case "$var" in 00:09:11.925 12:03:24 -- accel/accel.sh@20 -- # IFS=: 00:09:11.925 12:03:24 -- accel/accel.sh@20 -- # read -r var val 00:09:11.925 12:03:24 -- accel/accel.sh@21 -- # val=Yes 00:09:11.925 12:03:24 -- accel/accel.sh@22 -- # case "$var" in 00:09:11.925 12:03:24 -- accel/accel.sh@20 -- # IFS=: 00:09:11.925 12:03:24 -- accel/accel.sh@20 -- # read -r var val 00:09:11.925 12:03:24 -- accel/accel.sh@21 -- # val= 00:09:11.925 12:03:24 -- accel/accel.sh@22 -- # case "$var" in 00:09:11.925 12:03:24 -- accel/accel.sh@20 -- # IFS=: 00:09:11.925 12:03:24 -- accel/accel.sh@20 -- # read -r var val 00:09:11.925 12:03:24 -- accel/accel.sh@21 -- # val= 00:09:11.925 12:03:24 -- accel/accel.sh@22 -- # case "$var" in 00:09:11.925 12:03:24 -- accel/accel.sh@20 -- # IFS=: 00:09:11.925 12:03:24 -- accel/accel.sh@20 -- # read -r var val 00:09:12.860 12:03:25 -- accel/accel.sh@21 -- # val= 00:09:12.860 12:03:25 -- accel/accel.sh@22 -- # case "$var" in 00:09:12.860 12:03:25 -- accel/accel.sh@20 -- # IFS=: 00:09:12.860 12:03:25 -- accel/accel.sh@20 -- # read -r var val 00:09:12.860 12:03:25 -- accel/accel.sh@21 -- # val= 00:09:12.860 12:03:25 -- accel/accel.sh@22 -- # case "$var" in 00:09:12.860 12:03:25 -- accel/accel.sh@20 -- # IFS=: 00:09:12.860 12:03:25 -- accel/accel.sh@20 -- # read -r var val 00:09:12.860 12:03:25 -- accel/accel.sh@21 -- # val= 00:09:12.860 12:03:25 -- accel/accel.sh@22 -- # case "$var" in 00:09:12.860 12:03:25 -- accel/accel.sh@20 -- # IFS=: 00:09:12.860 12:03:25 -- accel/accel.sh@20 -- # read -r var val 00:09:12.860 12:03:25 -- accel/accel.sh@21 -- # val= 00:09:12.860 12:03:25 -- accel/accel.sh@22 -- # case "$var" in 00:09:12.860 12:03:25 -- accel/accel.sh@20 -- # IFS=: 00:09:12.860 12:03:25 -- accel/accel.sh@20 -- # read -r var val 00:09:12.860 12:03:25 -- accel/accel.sh@21 -- # val= 00:09:12.860 12:03:25 -- accel/accel.sh@22 -- # case "$var" in 00:09:12.860 12:03:25 -- accel/accel.sh@20 -- # IFS=: 00:09:12.860 12:03:25 -- accel/accel.sh@20 -- # read -r var val 00:09:12.860 12:03:25 -- accel/accel.sh@21 -- # val= 00:09:12.860 12:03:25 -- accel/accel.sh@22 -- # case "$var" in 00:09:12.860 12:03:25 -- accel/accel.sh@20 -- # IFS=: 00:09:12.860 12:03:25 -- accel/accel.sh@20 -- # read -r var val 00:09:12.860 12:03:25 -- accel/accel.sh@21 -- # val= 00:09:12.860 12:03:25 -- accel/accel.sh@22 -- # case "$var" in 00:09:12.860 12:03:25 -- accel/accel.sh@20 -- # IFS=: 00:09:12.860 12:03:25 -- accel/accel.sh@20 -- # read -r var val 00:09:12.860 12:03:25 -- accel/accel.sh@28 -- # [[ -n software ]] 00:09:12.860 12:03:25 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:09:12.860 12:03:25 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:12.860 00:09:12.860 real 0m2.788s 00:09:12.860 user 0m2.426s 00:09:12.861 sys 0m0.367s 00:09:12.861 12:03:25 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:12.861 12:03:25 -- common/autotest_common.sh@10 -- # 
set +x 00:09:12.861 ************************************ 00:09:12.861 END TEST accel_decomp_mthread 00:09:12.861 ************************************ 00:09:13.119 12:03:25 -- accel/accel.sh@114 -- # run_test accel_deomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:09:13.119 12:03:25 -- common/autotest_common.sh@1077 -- # '[' 13 -le 1 ']' 00:09:13.119 12:03:25 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:09:13.119 12:03:25 -- common/autotest_common.sh@10 -- # set +x 00:09:13.119 ************************************ 00:09:13.119 START TEST accel_deomp_full_mthread 00:09:13.119 ************************************ 00:09:13.119 12:03:25 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:09:13.119 12:03:25 -- accel/accel.sh@16 -- # local accel_opc 00:09:13.119 12:03:25 -- accel/accel.sh@17 -- # local accel_module 00:09:13.119 12:03:25 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:09:13.119 12:03:25 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:09:13.119 12:03:25 -- accel/accel.sh@12 -- # build_accel_config 00:09:13.119 12:03:25 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:09:13.119 12:03:25 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:13.119 12:03:25 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:13.119 12:03:25 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:09:13.119 12:03:25 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:09:13.119 12:03:25 -- accel/accel.sh@41 -- # local IFS=, 00:09:13.119 12:03:25 -- accel/accel.sh@42 -- # jq -r . 00:09:13.119 [2024-06-11 12:03:25.938412] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:09:13.119 [2024-06-11 12:03:25.938462] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2689756 ] 00:09:13.119 EAL: No free 2048 kB hugepages reported on node 1 00:09:13.119 [2024-06-11 12:03:26.038533] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:13.119 [2024-06-11 12:03:26.083071] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:14.657 12:03:27 -- accel/accel.sh@18 -- # out='Preparing input file... 00:09:14.657 00:09:14.657 SPDK Configuration: 00:09:14.657 Core mask: 0x1 00:09:14.657 00:09:14.657 Accel Perf Configuration: 00:09:14.657 Workload Type: decompress 00:09:14.658 Transfer size: 111250 bytes 00:09:14.658 Vector count 1 00:09:14.658 Module: software 00:09:14.658 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:09:14.658 Queue depth: 32 00:09:14.658 Allocate depth: 32 00:09:14.658 # threads/core: 2 00:09:14.658 Run time: 1 seconds 00:09:14.658 Verify: Yes 00:09:14.658 00:09:14.658 Running for 1 seconds... 
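The full-buffer variant drives the same accel_perf binary, only with the 111250-byte transfer size; its totals in the table that follows check out the same way (3936 transfers/s * 111250 bytes is roughly 417 MiB/s). A minimal sketch of the invocation, assuming the workspace layout above and omitting the -c /dev/fd/62 accel config that the harness supplies:

    spdk=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    $spdk/build/examples/accel_perf -t 1 -w decompress -l $spdk/test/accel/bib -y -o 0 -T 2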
00:09:14.658 00:09:14.658 Core,Thread Transfers Bandwidth Failed Miscompares 00:09:14.658 ------------------------------------------------------------------------------------ 00:09:14.658 0,1 1984/s 81 MiB/s 0 0 00:09:14.658 0,0 1952/s 80 MiB/s 0 0 00:09:14.658 ==================================================================================== 00:09:14.658 Total 3936/s 417 MiB/s 0 0' 00:09:14.658 12:03:27 -- accel/accel.sh@20 -- # IFS=: 00:09:14.658 12:03:27 -- accel/accel.sh@20 -- # read -r var val 00:09:14.658 12:03:27 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:09:14.658 12:03:27 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:09:14.658 12:03:27 -- accel/accel.sh@12 -- # build_accel_config 00:09:14.658 12:03:27 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:09:14.658 12:03:27 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:14.658 12:03:27 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:14.658 12:03:27 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:09:14.658 12:03:27 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:09:14.658 12:03:27 -- accel/accel.sh@41 -- # local IFS=, 00:09:14.658 12:03:27 -- accel/accel.sh@42 -- # jq -r . 00:09:14.658 [2024-06-11 12:03:27.328892] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:09:14.658 [2024-06-11 12:03:27.328976] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2689937 ] 00:09:14.658 EAL: No free 2048 kB hugepages reported on node 1 00:09:14.658 [2024-06-11 12:03:27.447086] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:14.658 [2024-06-11 12:03:27.490712] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:14.658 12:03:27 -- accel/accel.sh@21 -- # val= 00:09:14.658 12:03:27 -- accel/accel.sh@22 -- # case "$var" in 00:09:14.658 12:03:27 -- accel/accel.sh@20 -- # IFS=: 00:09:14.658 12:03:27 -- accel/accel.sh@20 -- # read -r var val 00:09:14.658 12:03:27 -- accel/accel.sh@21 -- # val= 00:09:14.658 12:03:27 -- accel/accel.sh@22 -- # case "$var" in 00:09:14.658 12:03:27 -- accel/accel.sh@20 -- # IFS=: 00:09:14.658 12:03:27 -- accel/accel.sh@20 -- # read -r var val 00:09:14.658 12:03:27 -- accel/accel.sh@21 -- # val= 00:09:14.658 12:03:27 -- accel/accel.sh@22 -- # case "$var" in 00:09:14.658 12:03:27 -- accel/accel.sh@20 -- # IFS=: 00:09:14.658 12:03:27 -- accel/accel.sh@20 -- # read -r var val 00:09:14.658 12:03:27 -- accel/accel.sh@21 -- # val=0x1 00:09:14.658 12:03:27 -- accel/accel.sh@22 -- # case "$var" in 00:09:14.658 12:03:27 -- accel/accel.sh@20 -- # IFS=: 00:09:14.658 12:03:27 -- accel/accel.sh@20 -- # read -r var val 00:09:14.658 12:03:27 -- accel/accel.sh@21 -- # val= 00:09:14.658 12:03:27 -- accel/accel.sh@22 -- # case "$var" in 00:09:14.658 12:03:27 -- accel/accel.sh@20 -- # IFS=: 00:09:14.658 12:03:27 -- accel/accel.sh@20 -- # read -r var val 00:09:14.658 12:03:27 -- accel/accel.sh@21 -- # val= 00:09:14.658 12:03:27 -- accel/accel.sh@22 -- # case "$var" in 00:09:14.658 12:03:27 -- accel/accel.sh@20 -- # IFS=: 00:09:14.658 12:03:27 -- accel/accel.sh@20 -- # read -r var val 00:09:14.658 12:03:27 -- accel/accel.sh@21 -- # val=decompress 
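The repetitive IFS=: / read -r var val / case fragments traced above come from the harness parsing accel_perf's captured output ($out) line by line, splitting each line on the colon to recover settings such as the workload type and module; that is why lone val=decompress, val=software and val=32 entries appear between the case statements. Roughly, as an assumed simplification of the accel.sh loop:

    while IFS=: read -r var val; do   # e.g. var='Workload Type', val=decompress
        case "$var" in
            ...) ;;                   # record the fields the test asserts on later
        esac
    done <<< "$out"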
00:09:14.658 12:03:27 -- accel/accel.sh@22 -- # case "$var" in 00:09:14.658 12:03:27 -- accel/accel.sh@24 -- # accel_opc=decompress 00:09:14.658 12:03:27 -- accel/accel.sh@20 -- # IFS=: 00:09:14.658 12:03:27 -- accel/accel.sh@20 -- # read -r var val 00:09:14.658 12:03:27 -- accel/accel.sh@21 -- # val='111250 bytes' 00:09:14.658 12:03:27 -- accel/accel.sh@22 -- # case "$var" in 00:09:14.658 12:03:27 -- accel/accel.sh@20 -- # IFS=: 00:09:14.658 12:03:27 -- accel/accel.sh@20 -- # read -r var val 00:09:14.658 12:03:27 -- accel/accel.sh@21 -- # val= 00:09:14.658 12:03:27 -- accel/accel.sh@22 -- # case "$var" in 00:09:14.658 12:03:27 -- accel/accel.sh@20 -- # IFS=: 00:09:14.658 12:03:27 -- accel/accel.sh@20 -- # read -r var val 00:09:14.658 12:03:27 -- accel/accel.sh@21 -- # val=software 00:09:14.658 12:03:27 -- accel/accel.sh@22 -- # case "$var" in 00:09:14.658 12:03:27 -- accel/accel.sh@23 -- # accel_module=software 00:09:14.658 12:03:27 -- accel/accel.sh@20 -- # IFS=: 00:09:14.658 12:03:27 -- accel/accel.sh@20 -- # read -r var val 00:09:14.658 12:03:27 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:09:14.658 12:03:27 -- accel/accel.sh@22 -- # case "$var" in 00:09:14.658 12:03:27 -- accel/accel.sh@20 -- # IFS=: 00:09:14.658 12:03:27 -- accel/accel.sh@20 -- # read -r var val 00:09:14.658 12:03:27 -- accel/accel.sh@21 -- # val=32 00:09:14.658 12:03:27 -- accel/accel.sh@22 -- # case "$var" in 00:09:14.658 12:03:27 -- accel/accel.sh@20 -- # IFS=: 00:09:14.658 12:03:27 -- accel/accel.sh@20 -- # read -r var val 00:09:14.658 12:03:27 -- accel/accel.sh@21 -- # val=32 00:09:14.658 12:03:27 -- accel/accel.sh@22 -- # case "$var" in 00:09:14.658 12:03:27 -- accel/accel.sh@20 -- # IFS=: 00:09:14.658 12:03:27 -- accel/accel.sh@20 -- # read -r var val 00:09:14.658 12:03:27 -- accel/accel.sh@21 -- # val=2 00:09:14.658 12:03:27 -- accel/accel.sh@22 -- # case "$var" in 00:09:14.658 12:03:27 -- accel/accel.sh@20 -- # IFS=: 00:09:14.658 12:03:27 -- accel/accel.sh@20 -- # read -r var val 00:09:14.658 12:03:27 -- accel/accel.sh@21 -- # val='1 seconds' 00:09:14.658 12:03:27 -- accel/accel.sh@22 -- # case "$var" in 00:09:14.658 12:03:27 -- accel/accel.sh@20 -- # IFS=: 00:09:14.658 12:03:27 -- accel/accel.sh@20 -- # read -r var val 00:09:14.658 12:03:27 -- accel/accel.sh@21 -- # val=Yes 00:09:14.658 12:03:27 -- accel/accel.sh@22 -- # case "$var" in 00:09:14.658 12:03:27 -- accel/accel.sh@20 -- # IFS=: 00:09:14.658 12:03:27 -- accel/accel.sh@20 -- # read -r var val 00:09:14.658 12:03:27 -- accel/accel.sh@21 -- # val= 00:09:14.658 12:03:27 -- accel/accel.sh@22 -- # case "$var" in 00:09:14.658 12:03:27 -- accel/accel.sh@20 -- # IFS=: 00:09:14.658 12:03:27 -- accel/accel.sh@20 -- # read -r var val 00:09:14.658 12:03:27 -- accel/accel.sh@21 -- # val= 00:09:14.658 12:03:27 -- accel/accel.sh@22 -- # case "$var" in 00:09:14.658 12:03:27 -- accel/accel.sh@20 -- # IFS=: 00:09:14.658 12:03:27 -- accel/accel.sh@20 -- # read -r var val 00:09:16.036 12:03:28 -- accel/accel.sh@21 -- # val= 00:09:16.036 12:03:28 -- accel/accel.sh@22 -- # case "$var" in 00:09:16.036 12:03:28 -- accel/accel.sh@20 -- # IFS=: 00:09:16.036 12:03:28 -- accel/accel.sh@20 -- # read -r var val 00:09:16.036 12:03:28 -- accel/accel.sh@21 -- # val= 00:09:16.036 12:03:28 -- accel/accel.sh@22 -- # case "$var" in 00:09:16.036 12:03:28 -- accel/accel.sh@20 -- # IFS=: 00:09:16.037 12:03:28 -- accel/accel.sh@20 -- # read -r var val 00:09:16.037 12:03:28 -- accel/accel.sh@21 -- # val= 00:09:16.037 12:03:28 -- 
accel/accel.sh@22 -- # case "$var" in 00:09:16.037 12:03:28 -- accel/accel.sh@20 -- # IFS=: 00:09:16.037 12:03:28 -- accel/accel.sh@20 -- # read -r var val 00:09:16.037 12:03:28 -- accel/accel.sh@21 -- # val= 00:09:16.037 12:03:28 -- accel/accel.sh@22 -- # case "$var" in 00:09:16.037 12:03:28 -- accel/accel.sh@20 -- # IFS=: 00:09:16.037 12:03:28 -- accel/accel.sh@20 -- # read -r var val 00:09:16.037 12:03:28 -- accel/accel.sh@21 -- # val= 00:09:16.037 12:03:28 -- accel/accel.sh@22 -- # case "$var" in 00:09:16.037 12:03:28 -- accel/accel.sh@20 -- # IFS=: 00:09:16.037 12:03:28 -- accel/accel.sh@20 -- # read -r var val 00:09:16.037 12:03:28 -- accel/accel.sh@21 -- # val= 00:09:16.037 12:03:28 -- accel/accel.sh@22 -- # case "$var" in 00:09:16.037 12:03:28 -- accel/accel.sh@20 -- # IFS=: 00:09:16.037 12:03:28 -- accel/accel.sh@20 -- # read -r var val 00:09:16.037 12:03:28 -- accel/accel.sh@21 -- # val= 00:09:16.037 12:03:28 -- accel/accel.sh@22 -- # case "$var" in 00:09:16.037 12:03:28 -- accel/accel.sh@20 -- # IFS=: 00:09:16.037 12:03:28 -- accel/accel.sh@20 -- # read -r var val 00:09:16.037 12:03:28 -- accel/accel.sh@28 -- # [[ -n software ]] 00:09:16.037 12:03:28 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:09:16.037 12:03:28 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:16.037 00:09:16.037 real 0m2.785s 00:09:16.037 user 0m2.478s 00:09:16.037 sys 0m0.311s 00:09:16.037 12:03:28 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:16.037 12:03:28 -- common/autotest_common.sh@10 -- # set +x 00:09:16.037 ************************************ 00:09:16.037 END TEST accel_deomp_full_mthread 00:09:16.037 ************************************ 00:09:16.037 12:03:28 -- accel/accel.sh@116 -- # [[ n == y ]] 00:09:16.037 12:03:28 -- accel/accel.sh@129 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:09:16.037 12:03:28 -- accel/accel.sh@129 -- # build_accel_config 00:09:16.037 12:03:28 -- common/autotest_common.sh@1077 -- # '[' 4 -le 1 ']' 00:09:16.037 12:03:28 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:09:16.037 12:03:28 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:09:16.037 12:03:28 -- common/autotest_common.sh@10 -- # set +x 00:09:16.037 12:03:28 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:16.037 12:03:28 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:16.037 12:03:28 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:09:16.037 12:03:28 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:09:16.037 12:03:28 -- accel/accel.sh@41 -- # local IFS=, 00:09:16.037 12:03:28 -- accel/accel.sh@42 -- # jq -r . 00:09:16.037 ************************************ 00:09:16.037 START TEST accel_dif_functional_tests 00:09:16.037 ************************************ 00:09:16.037 12:03:28 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:09:16.037 [2024-06-11 12:03:28.780932] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 
00:09:16.037 [2024-06-11 12:03:28.781012] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2690141 ] 00:09:16.037 EAL: No free 2048 kB hugepages reported on node 1 00:09:16.037 [2024-06-11 12:03:28.899709] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:09:16.037 [2024-06-11 12:03:28.948051] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:09:16.037 [2024-06-11 12:03:28.948135] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:09:16.037 [2024-06-11 12:03:28.948139] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:16.037 00:09:16.037 00:09:16.037 CUnit - A unit testing framework for C - Version 2.1-3 00:09:16.037 http://cunit.sourceforge.net/ 00:09:16.037 00:09:16.037 00:09:16.037 Suite: accel_dif 00:09:16.037 Test: verify: DIF generated, GUARD check ...passed 00:09:16.037 Test: verify: DIF generated, APPTAG check ...passed 00:09:16.037 Test: verify: DIF generated, REFTAG check ...passed 00:09:16.037 Test: verify: DIF not generated, GUARD check ...[2024-06-11 12:03:29.027006] dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:09:16.037 [2024-06-11 12:03:29.027066] dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:09:16.037 passed 00:09:16.037 Test: verify: DIF not generated, APPTAG check ...[2024-06-11 12:03:29.027112] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:09:16.037 [2024-06-11 12:03:29.027139] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:09:16.037 passed 00:09:16.037 Test: verify: DIF not generated, REFTAG check ...[2024-06-11 12:03:29.027168] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:09:16.037 [2024-06-11 12:03:29.027194] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:09:16.037 passed 00:09:16.037 Test: verify: APPTAG correct, APPTAG check ...passed 00:09:16.037 Test: verify: APPTAG incorrect, APPTAG check ...[2024-06-11 12:03:29.027254] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:09:16.037 passed 00:09:16.037 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:09:16.037 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:09:16.037 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:09:16.037 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-06-11 12:03:29.027392] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:09:16.037 passed 00:09:16.037 Test: generate copy: DIF generated, GUARD check ...passed 00:09:16.037 Test: generate copy: DIF generated, APTTAG check ...passed 00:09:16.037 Test: generate copy: DIF generated, REFTAG check ...passed 00:09:16.037 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:09:16.037 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:09:16.037 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:09:16.037 Test: generate copy: iovecs-len validate ...[2024-06-11 12:03:29.027642] dif.c:1167:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 
00:09:16.037 passed 00:09:16.037 Test: generate copy: buffer alignment validate ...passed 00:09:16.037 00:09:16.037 Run Summary: Type Total Ran Passed Failed Inactive 00:09:16.037 suites 1 1 n/a 0 0 00:09:16.037 tests 20 20 20 0 0 00:09:16.037 asserts 204 204 204 0 n/a 00:09:16.037 00:09:16.037 Elapsed time = 0.003 seconds 00:09:16.296 00:09:16.296 real 0m0.458s 00:09:16.296 user 0m0.660s 00:09:16.296 sys 0m0.211s 00:09:16.296 12:03:29 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:16.296 12:03:29 -- common/autotest_common.sh@10 -- # set +x 00:09:16.296 ************************************ 00:09:16.296 END TEST accel_dif_functional_tests 00:09:16.296 ************************************ 00:09:16.296 00:09:16.296 real 0m59.323s 00:09:16.296 user 1m5.034s 00:09:16.296 sys 0m9.349s 00:09:16.296 12:03:29 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:16.296 12:03:29 -- common/autotest_common.sh@10 -- # set +x 00:09:16.296 ************************************ 00:09:16.296 END TEST accel 00:09:16.296 ************************************ 00:09:16.296 12:03:29 -- spdk/autotest.sh@190 -- # run_test accel_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:09:16.296 12:03:29 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:09:16.296 12:03:29 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:09:16.296 12:03:29 -- common/autotest_common.sh@10 -- # set +x 00:09:16.296 ************************************ 00:09:16.296 START TEST accel_rpc 00:09:16.296 ************************************ 00:09:16.296 12:03:29 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:09:16.555 * Looking for test storage... 00:09:16.555 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:09:16.555 12:03:29 -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:09:16.555 12:03:29 -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=2690362 00:09:16.555 12:03:29 -- accel/accel_rpc.sh@15 -- # waitforlisten 2690362 00:09:16.555 12:03:29 -- common/autotest_common.sh@819 -- # '[' -z 2690362 ']' 00:09:16.555 12:03:29 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:16.555 12:03:29 -- common/autotest_common.sh@824 -- # local max_retries=100 00:09:16.555 12:03:29 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:16.555 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:16.555 12:03:29 -- common/autotest_common.sh@828 -- # xtrace_disable 00:09:16.555 12:03:29 -- common/autotest_common.sh@10 -- # set +x 00:09:16.555 12:03:29 -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:09:16.555 [2024-06-11 12:03:29.434095] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 
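A note on the -c /dev/fd/62 arguments used throughout: the apps read their JSON accel configuration from file descriptor 62, which the calling script populates when it launches the binary. One bash way to reproduce the pattern, with a hypothetical payload and app name:

    cfg='{"example_key": true}'          # hypothetical config, not from this run
    some_app -c /dev/fd/62 62<<< "$cfg"  # here-string attached to fd 62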
00:09:16.555 [2024-06-11 12:03:29.434166] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2690362 ] 00:09:16.555 EAL: No free 2048 kB hugepages reported on node 1 00:09:16.555 [2024-06-11 12:03:29.551119] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:16.814 [2024-06-11 12:03:29.596288] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:16.814 [2024-06-11 12:03:29.596447] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:16.814 12:03:29 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:09:16.814 12:03:29 -- common/autotest_common.sh@852 -- # return 0 00:09:16.814 12:03:29 -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:09:16.814 12:03:29 -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:09:16.815 12:03:29 -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:09:16.815 12:03:29 -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:09:16.815 12:03:29 -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:09:16.815 12:03:29 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:09:16.815 12:03:29 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:09:16.815 12:03:29 -- common/autotest_common.sh@10 -- # set +x 00:09:16.815 ************************************ 00:09:16.815 START TEST accel_assign_opcode 00:09:16.815 ************************************ 00:09:16.815 12:03:29 -- common/autotest_common.sh@1104 -- # accel_assign_opcode_test_suite 00:09:16.815 12:03:29 -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:09:16.815 12:03:29 -- common/autotest_common.sh@551 -- # xtrace_disable 00:09:16.815 12:03:29 -- common/autotest_common.sh@10 -- # set +x 00:09:16.815 [2024-06-11 12:03:29.661018] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:09:16.815 12:03:29 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:09:16.815 12:03:29 -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:09:16.815 12:03:29 -- common/autotest_common.sh@551 -- # xtrace_disable 00:09:16.815 12:03:29 -- common/autotest_common.sh@10 -- # set +x 00:09:16.815 [2024-06-11 12:03:29.669029] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:09:16.815 12:03:29 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:09:16.815 12:03:29 -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:09:16.815 12:03:29 -- common/autotest_common.sh@551 -- # xtrace_disable 00:09:16.815 12:03:29 -- common/autotest_common.sh@10 -- # set +x 00:09:17.074 12:03:29 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:09:17.074 12:03:29 -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:09:17.074 12:03:29 -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:09:17.074 12:03:29 -- common/autotest_common.sh@551 -- # xtrace_disable 00:09:17.074 12:03:29 -- accel/accel_rpc.sh@42 -- # grep software 00:09:17.074 12:03:29 -- common/autotest_common.sh@10 -- # set +x 00:09:17.074 12:03:29 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:09:17.074 software 00:09:17.074 00:09:17.074 real 0m0.260s 00:09:17.074 user 0m0.043s 00:09:17.074 sys 0m0.014s 00:09:17.074 12:03:29 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:17.074 12:03:29 -- common/autotest_common.sh@10 -- # set +x 
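The accel_assign_opcode suite above drives a spdk_tgt started with --wait-for-rpc through three RPCs: assign the copy opcode to a module (first the deliberately bogus 'incorrect', then software), complete initialization, and read the assignment back. Condensed to the rpc.py calls visible in the trace:

    rpc.py accel_assign_opc -o copy -m software    # queue the assignment before init
    rpc.py framework_start_init                    # finish SPDK subsystem startup
    rpc.py accel_get_opc_assignments | jq -r .copy | grep software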
00:09:17.074 ************************************ 00:09:17.074 END TEST accel_assign_opcode 00:09:17.074 ************************************ 00:09:17.074 12:03:29 -- accel/accel_rpc.sh@55 -- # killprocess 2690362 00:09:17.074 12:03:29 -- common/autotest_common.sh@926 -- # '[' -z 2690362 ']' 00:09:17.074 12:03:29 -- common/autotest_common.sh@930 -- # kill -0 2690362 00:09:17.074 12:03:29 -- common/autotest_common.sh@931 -- # uname 00:09:17.074 12:03:29 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:09:17.074 12:03:29 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2690362 00:09:17.074 12:03:29 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:09:17.074 12:03:30 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:09:17.074 12:03:30 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2690362' 00:09:17.074 killing process with pid 2690362 00:09:17.074 12:03:30 -- common/autotest_common.sh@945 -- # kill 2690362 00:09:17.074 12:03:30 -- common/autotest_common.sh@950 -- # wait 2690362 00:09:17.332 00:09:17.332 real 0m1.012s 00:09:17.332 user 0m0.884s 00:09:17.332 sys 0m0.510s 00:09:17.332 12:03:30 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:17.332 12:03:30 -- common/autotest_common.sh@10 -- # set +x 00:09:17.332 ************************************ 00:09:17.332 END TEST accel_rpc 00:09:17.332 ************************************ 00:09:17.332 12:03:30 -- spdk/autotest.sh@191 -- # run_test app_cmdline /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:09:17.332 12:03:30 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:09:17.332 12:03:30 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:09:17.332 12:03:30 -- common/autotest_common.sh@10 -- # set +x 00:09:17.591 ************************************ 00:09:17.591 START TEST app_cmdline 00:09:17.591 ************************************ 00:09:17.591 12:03:30 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:09:17.591 * Looking for test storage... 00:09:17.591 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:09:17.591 12:03:30 -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:09:17.591 12:03:30 -- app/cmdline.sh@17 -- # spdk_tgt_pid=2690549 00:09:17.591 12:03:30 -- app/cmdline.sh@18 -- # waitforlisten 2690549 00:09:17.591 12:03:30 -- app/cmdline.sh@16 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:09:17.591 12:03:30 -- common/autotest_common.sh@819 -- # '[' -z 2690549 ']' 00:09:17.591 12:03:30 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:17.591 12:03:30 -- common/autotest_common.sh@824 -- # local max_retries=100 00:09:17.591 12:03:30 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:17.591 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:17.591 12:03:30 -- common/autotest_common.sh@828 -- # xtrace_disable 00:09:17.591 12:03:30 -- common/autotest_common.sh@10 -- # set +x 00:09:17.591 [2024-06-11 12:03:30.497594] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 
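The killprocess helper traced just above follows a standard shutdown recipe: probe the PID with signal 0, confirm by name that it is the expected process (SPDK renames its main thread to reactor_0, hence the ps check), then kill it and wait for it to exit. A condensed sketch of that logic:

    pid=2690362
    kill -0 "$pid"                                         # liveness probe, delivers no signal
    [[ $(ps --no-headers -o comm= "$pid") == reactor_0 ]]  # make sure it is our reactor
    kill "$pid" && wait "$pid"                             # wait succeeds because spdk_tgt is a child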
00:09:17.591 [2024-06-11 12:03:30.497676] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2690549 ] 00:09:17.591 EAL: No free 2048 kB hugepages reported on node 1 00:09:17.591 [2024-06-11 12:03:30.619147] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:17.850 [2024-06-11 12:03:30.664121] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:17.850 [2024-06-11 12:03:30.664275] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:18.417 12:03:31 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:09:18.417 12:03:31 -- common/autotest_common.sh@852 -- # return 0 00:09:18.675 12:03:31 -- app/cmdline.sh@20 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:09:18.675 { 00:09:18.675 "version": "SPDK v24.01.1-pre git sha1 130b9406a", 00:09:18.675 "fields": { 00:09:18.675 "major": 24, 00:09:18.675 "minor": 1, 00:09:18.675 "patch": 1, 00:09:18.675 "suffix": "-pre", 00:09:18.675 "commit": "130b9406a" 00:09:18.675 } 00:09:18.675 } 00:09:18.675 12:03:31 -- app/cmdline.sh@22 -- # expected_methods=() 00:09:18.675 12:03:31 -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:09:18.675 12:03:31 -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:09:18.675 12:03:31 -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:09:18.675 12:03:31 -- app/cmdline.sh@26 -- # jq -r '.[]' 00:09:18.675 12:03:31 -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:09:18.675 12:03:31 -- common/autotest_common.sh@551 -- # xtrace_disable 00:09:18.675 12:03:31 -- common/autotest_common.sh@10 -- # set +x 00:09:18.675 12:03:31 -- app/cmdline.sh@26 -- # sort 00:09:18.675 12:03:31 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:09:18.934 12:03:31 -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:09:18.934 12:03:31 -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:09:18.934 12:03:31 -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:09:18.934 12:03:31 -- common/autotest_common.sh@640 -- # local es=0 00:09:18.934 12:03:31 -- common/autotest_common.sh@642 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:09:18.934 12:03:31 -- common/autotest_common.sh@628 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:09:18.934 12:03:31 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:09:18.934 12:03:31 -- common/autotest_common.sh@632 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:09:18.934 12:03:31 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:09:18.934 12:03:31 -- common/autotest_common.sh@634 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:09:18.934 12:03:31 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:09:18.934 12:03:31 -- common/autotest_common.sh@634 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:09:18.934 12:03:31 -- common/autotest_common.sh@634 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py ]] 00:09:18.934 12:03:31 -- 
common/autotest_common.sh@643 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:09:18.934 request: 00:09:18.934 { 00:09:18.934 "method": "env_dpdk_get_mem_stats", 00:09:18.934 "req_id": 1 00:09:18.934 } 00:09:18.934 Got JSON-RPC error response 00:09:18.934 response: 00:09:18.934 { 00:09:18.934 "code": -32601, 00:09:18.934 "message": "Method not found" 00:09:18.934 } 00:09:19.193 12:03:31 -- common/autotest_common.sh@643 -- # es=1 00:09:19.193 12:03:31 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:09:19.193 12:03:31 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:09:19.193 12:03:31 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:09:19.193 12:03:31 -- app/cmdline.sh@1 -- # killprocess 2690549 00:09:19.193 12:03:31 -- common/autotest_common.sh@926 -- # '[' -z 2690549 ']' 00:09:19.193 12:03:31 -- common/autotest_common.sh@930 -- # kill -0 2690549 00:09:19.193 12:03:31 -- common/autotest_common.sh@931 -- # uname 00:09:19.193 12:03:31 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:09:19.193 12:03:31 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 2690549 00:09:19.193 12:03:32 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:09:19.193 12:03:32 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:09:19.193 12:03:32 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 2690549' 00:09:19.193 killing process with pid 2690549 00:09:19.193 12:03:32 -- common/autotest_common.sh@945 -- # kill 2690549 00:09:19.193 12:03:32 -- common/autotest_common.sh@950 -- # wait 2690549 00:09:19.452 00:09:19.452 real 0m1.992s 00:09:19.452 user 0m2.410s 00:09:19.452 sys 0m0.603s 00:09:19.452 12:03:32 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:19.452 12:03:32 -- common/autotest_common.sh@10 -- # set +x 00:09:19.452 ************************************ 00:09:19.452 END TEST app_cmdline 00:09:19.452 ************************************ 00:09:19.452 12:03:32 -- spdk/autotest.sh@192 -- # run_test version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:09:19.452 12:03:32 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:09:19.452 12:03:32 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:09:19.452 12:03:32 -- common/autotest_common.sh@10 -- # set +x 00:09:19.452 ************************************ 00:09:19.452 START TEST version 00:09:19.452 ************************************ 00:09:19.452 12:03:32 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:09:19.711 * Looking for test storage... 
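The -32601 'Method not found' response above is the expected outcome, not a failure: this spdk_tgt instance was launched with --rpcs-allowed spdk_get_version,rpc_get_methods, so those two methods succeed (as the version JSON earlier in the test shows) while everything else, including env_dpdk_get_mem_stats, is rejected before dispatch. Sketched against such a target:

    spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods &
    rpc.py spdk_get_version          # allowed, returns the version object
    rpc.py env_dpdk_get_mem_stats    # rejected with JSON-RPC error -32601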
00:09:19.711 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:09:19.711 12:03:32 -- app/version.sh@17 -- # get_header_version major 00:09:19.711 12:03:32 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:09:19.711 12:03:32 -- app/version.sh@14 -- # tr -d '"' 00:09:19.711 12:03:32 -- app/version.sh@14 -- # cut -f2 00:09:19.711 12:03:32 -- app/version.sh@17 -- # major=24 00:09:19.711 12:03:32 -- app/version.sh@18 -- # get_header_version minor 00:09:19.711 12:03:32 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:09:19.711 12:03:32 -- app/version.sh@14 -- # tr -d '"' 00:09:19.711 12:03:32 -- app/version.sh@14 -- # cut -f2 00:09:19.711 12:03:32 -- app/version.sh@18 -- # minor=1 00:09:19.711 12:03:32 -- app/version.sh@19 -- # get_header_version patch 00:09:19.711 12:03:32 -- app/version.sh@14 -- # tr -d '"' 00:09:19.711 12:03:32 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:09:19.711 12:03:32 -- app/version.sh@14 -- # cut -f2 00:09:19.711 12:03:32 -- app/version.sh@19 -- # patch=1 00:09:19.711 12:03:32 -- app/version.sh@20 -- # get_header_version suffix 00:09:19.711 12:03:32 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:09:19.711 12:03:32 -- app/version.sh@14 -- # cut -f2 00:09:19.711 12:03:32 -- app/version.sh@14 -- # tr -d '"' 00:09:19.711 12:03:32 -- app/version.sh@20 -- # suffix=-pre 00:09:19.711 12:03:32 -- app/version.sh@22 -- # version=24.1 00:09:19.711 12:03:32 -- app/version.sh@25 -- # (( patch != 0 )) 00:09:19.711 12:03:32 -- app/version.sh@25 -- # version=24.1.1 00:09:19.711 12:03:32 -- app/version.sh@28 -- # version=24.1.1rc0 00:09:19.711 12:03:32 -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:09:19.711 12:03:32 -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:09:19.711 12:03:32 -- app/version.sh@30 -- # py_version=24.1.1rc0 00:09:19.711 12:03:32 -- app/version.sh@31 -- # [[ 24.1.1rc0 == \2\4\.\1\.\1\r\c\0 ]] 00:09:19.711 00:09:19.711 real 0m0.190s 00:09:19.711 user 0m0.093s 00:09:19.711 sys 0m0.139s 00:09:19.711 12:03:32 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:19.711 12:03:32 -- common/autotest_common.sh@10 -- # set +x 00:09:19.711 ************************************ 00:09:19.711 END TEST version 00:09:19.711 ************************************ 00:09:19.711 12:03:32 -- spdk/autotest.sh@194 -- # '[' 0 -eq 1 ']' 00:09:19.711 12:03:32 -- spdk/autotest.sh@204 -- # uname -s 00:09:19.711 12:03:32 -- spdk/autotest.sh@204 -- # [[ Linux == Linux ]] 00:09:19.711 12:03:32 -- spdk/autotest.sh@205 -- # [[ 0 -eq 1 ]] 00:09:19.711 12:03:32 -- spdk/autotest.sh@205 -- # [[ 0 -eq 1 ]] 00:09:19.711 12:03:32 -- spdk/autotest.sh@217 -- # '[' 0 -eq 1 ']' 00:09:19.711 12:03:32 -- spdk/autotest.sh@264 -- # '[' 0 -eq 1 ']' 00:09:19.711 12:03:32 -- spdk/autotest.sh@268 -- # timing_exit lib 
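The version test above recovers each component by grepping include/spdk/version.h, assembles 24.1.1 from major/minor/patch (the -pre suffix is reported as rc0 here), and requires python's spdk.__version__ to agree. Condensed from the trace:

    get_header_version() {   # e.g. get_header_version MAJOR -> 24
        grep -E "^#define SPDK_VERSION_${1}[[:space:]]+" include/spdk/version.h |
            cut -f2 | tr -d '"'
    }
    ver=$(get_header_version MAJOR).$(get_header_version MINOR).$(get_header_version PATCH)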
00:09:19.711 12:03:32 -- common/autotest_common.sh@718 -- # xtrace_disable 00:09:19.711 12:03:32 -- common/autotest_common.sh@10 -- # set +x 00:09:19.711 12:03:32 -- spdk/autotest.sh@270 -- # '[' 0 -eq 1 ']' 00:09:19.711 12:03:32 -- spdk/autotest.sh@278 -- # '[' 0 -eq 1 ']' 00:09:19.711 12:03:32 -- spdk/autotest.sh@287 -- # '[' 0 -eq 1 ']' 00:09:19.711 12:03:32 -- spdk/autotest.sh@311 -- # '[' 0 -eq 1 ']' 00:09:19.711 12:03:32 -- spdk/autotest.sh@315 -- # '[' 0 -eq 1 ']' 00:09:19.711 12:03:32 -- spdk/autotest.sh@319 -- # '[' 0 -eq 1 ']' 00:09:19.711 12:03:32 -- spdk/autotest.sh@324 -- # '[' 0 -eq 1 ']' 00:09:19.711 12:03:32 -- spdk/autotest.sh@333 -- # '[' 0 -eq 1 ']' 00:09:19.711 12:03:32 -- spdk/autotest.sh@338 -- # '[' 0 -eq 1 ']' 00:09:19.711 12:03:32 -- spdk/autotest.sh@342 -- # '[' 0 -eq 1 ']' 00:09:19.711 12:03:32 -- spdk/autotest.sh@346 -- # '[' 0 -eq 1 ']' 00:09:19.711 12:03:32 -- spdk/autotest.sh@350 -- # '[' 0 -eq 1 ']' 00:09:19.711 12:03:32 -- spdk/autotest.sh@355 -- # '[' 0 -eq 1 ']' 00:09:19.711 12:03:32 -- spdk/autotest.sh@359 -- # '[' 0 -eq 1 ']' 00:09:19.711 12:03:32 -- spdk/autotest.sh@366 -- # [[ 0 -eq 1 ]] 00:09:19.711 12:03:32 -- spdk/autotest.sh@370 -- # [[ 0 -eq 1 ]] 00:09:19.711 12:03:32 -- spdk/autotest.sh@374 -- # [[ 1 -eq 1 ]] 00:09:19.711 12:03:32 -- spdk/autotest.sh@375 -- # run_test llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:09:19.711 12:03:32 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:09:19.711 12:03:32 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:09:19.711 12:03:32 -- common/autotest_common.sh@10 -- # set +x 00:09:19.711 ************************************ 00:09:19.711 START TEST llvm_fuzz 00:09:19.711 ************************************ 00:09:19.711 12:03:32 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:09:19.971 * Looking for test storage... 
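Every START TEST/END TEST banner pair in this log comes from the run_test wrapper, which also times the wrapped command and produces the real/user/sys triples printed after each test. A simplified sketch of that behavior (assumed shape, not copied from the actual helper):

    run_test() {
        local name=$1; shift
        echo "START TEST $name"
        time "$@"               # yields the real/user/sys lines seen above
        echo "END TEST $name"
    }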
00:09:19.971 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz 00:09:19.971 12:03:32 -- fuzz/llvm.sh@11 -- # fuzzers=($(get_fuzzer_targets)) 00:09:19.971 12:03:32 -- fuzz/llvm.sh@11 -- # get_fuzzer_targets 00:09:19.971 12:03:32 -- common/autotest_common.sh@538 -- # fuzzers=() 00:09:19.971 12:03:32 -- common/autotest_common.sh@538 -- # local fuzzers 00:09:19.971 12:03:32 -- common/autotest_common.sh@540 -- # [[ -n '' ]] 00:09:19.971 12:03:32 -- common/autotest_common.sh@543 -- # fuzzers=("$rootdir/test/fuzz/llvm/"*) 00:09:19.971 12:03:32 -- common/autotest_common.sh@544 -- # fuzzers=("${fuzzers[@]##*/}") 00:09:19.971 12:03:32 -- common/autotest_common.sh@547 -- # echo 'common.sh llvm-gcov.sh nvmf vfio' 00:09:19.971 12:03:32 -- fuzz/llvm.sh@13 -- # llvm_out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:09:19.971 12:03:32 -- fuzz/llvm.sh@15 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/coverage 00:09:19.971 12:03:32 -- fuzz/llvm.sh@56 -- # [[ 1 -eq 0 ]] 00:09:19.971 12:03:32 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:09:19.971 12:03:32 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:09:19.971 12:03:32 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:09:19.971 12:03:32 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:09:19.971 12:03:32 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:09:19.971 12:03:32 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:09:19.971 12:03:32 -- fuzz/llvm.sh@62 -- # run_test nvmf_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:09:19.971 12:03:32 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:09:19.971 12:03:32 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:09:19.971 12:03:32 -- common/autotest_common.sh@10 -- # set +x 00:09:19.971 ************************************ 00:09:19.971 START TEST nvmf_fuzz 00:09:19.971 ************************************ 00:09:19.971 12:03:32 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:09:19.971 * Looking for test storage... 
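The fuzzer list above is built by globbing test/fuzz/llvm/ and stripping the directory prefix from each entry, which is why it echoes as 'common.sh llvm-gcov.sh nvmf vfio'; the case statement then skips the two helper scripts and runs only the nvmf and vfio targets. The two expansions, as traced:

    fuzzers=("$rootdir/test/fuzz/llvm/"*)   # full paths from the glob
    fuzzers=("${fuzzers[@]##*/}")           # ##*/ strips through the last slash, leaving basenames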
00:09:19.971 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:09:19.971 12:03:32 -- nvmf/run.sh@52 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:09:19.971 12:03:32 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:09:19.971 12:03:32 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:09:19.971 12:03:32 -- common/autotest_common.sh@34 -- # set -e 00:09:19.971 12:03:32 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:09:19.972 12:03:32 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:09:19.972 12:03:32 -- common/autotest_common.sh@38 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:09:19.972 12:03:32 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:09:19.972 12:03:32 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:09:19.972 12:03:32 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:09:19.972 12:03:32 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:09:19.972 12:03:32 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:09:19.972 12:03:32 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:09:19.972 12:03:32 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:09:19.972 12:03:32 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:09:19.972 12:03:32 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:09:19.972 12:03:32 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:09:19.972 12:03:32 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:09:19.972 12:03:32 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:09:19.972 12:03:32 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:09:19.972 12:03:32 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:09:19.972 12:03:32 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:09:19.972 12:03:32 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:09:19.972 12:03:32 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:09:19.972 12:03:32 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:09:19.972 12:03:32 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:09:19.972 12:03:32 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:09:19.972 12:03:32 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:09:19.972 12:03:32 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:09:19.972 12:03:32 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:09:19.972 12:03:32 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:09:19.972 12:03:32 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:09:19.972 12:03:32 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:09:19.972 12:03:32 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:09:19.972 12:03:32 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:09:19.972 12:03:32 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:09:19.972 12:03:32 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:09:19.972 12:03:32 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:09:19.972 12:03:32 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:09:19.972 12:03:32 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:09:19.972 12:03:32 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:09:19.972 12:03:32 -- common/build_config.sh@34 -- # 
CONFIG_FUZZER_LIB=/usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:09:19.972 12:03:32 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:09:19.972 12:03:32 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:09:19.972 12:03:32 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:09:19.972 12:03:32 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:09:19.972 12:03:32 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:09:19.972 12:03:32 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:09:19.972 12:03:32 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:09:19.972 12:03:32 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:09:19.972 12:03:32 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:09:19.972 12:03:32 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:09:19.972 12:03:32 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:09:19.972 12:03:32 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:09:19.972 12:03:32 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:09:19.972 12:03:32 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:09:19.972 12:03:32 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:09:19.972 12:03:32 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:09:19.972 12:03:32 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:09:19.972 12:03:32 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:09:19.972 12:03:32 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n 00:09:19.972 12:03:32 -- common/build_config.sh@54 -- # CONFIG_WERROR=y 00:09:19.972 12:03:32 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n 00:09:19.972 12:03:32 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y 00:09:19.972 12:03:32 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR= 00:09:19.972 12:03:32 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n 00:09:19.972 12:03:32 -- common/build_config.sh@59 -- # CONFIG_ISAL=y 00:09:19.972 12:03:32 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=y 00:09:19.972 12:03:32 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:09:19.972 12:03:32 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs 00:09:19.972 12:03:32 -- common/build_config.sh@63 -- # CONFIG_APPS=y 00:09:19.972 12:03:32 -- common/build_config.sh@64 -- # CONFIG_SHARED=n 00:09:19.972 12:03:32 -- common/build_config.sh@65 -- # CONFIG_FC_PATH= 00:09:19.972 12:03:32 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n 00:09:19.972 12:03:32 -- common/build_config.sh@67 -- # CONFIG_FC=n 00:09:19.972 12:03:32 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n 00:09:19.972 12:03:32 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y 00:09:19.972 12:03:32 -- common/build_config.sh@70 -- # CONFIG_RAID5F=n 00:09:19.972 12:03:32 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y 00:09:19.972 12:03:32 -- common/build_config.sh@72 -- # CONFIG_TESTS=y 00:09:19.972 12:03:32 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n 00:09:19.972 12:03:32 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES= 00:09:19.972 12:03:32 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n 00:09:19.972 12:03:32 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y 00:09:19.972 12:03:32 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n 00:09:19.972 12:03:32 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX= 00:09:19.972 
12:03:32 -- common/build_config.sh@79 -- # CONFIG_URING=n 00:09:19.972 12:03:32 -- common/autotest_common.sh@48 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:09:19.972 12:03:32 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:09:19.972 12:03:32 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:09:19.972 12:03:32 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:09:19.972 12:03:32 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:19.972 12:03:32 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:09:19.972 12:03:32 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:09:19.972 12:03:32 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:09:19.972 12:03:32 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:09:19.972 12:03:32 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:09:19.972 12:03:32 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:09:19.972 12:03:32 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:09:19.972 12:03:32 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:09:19.972 12:03:32 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:09:19.972 12:03:32 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:09:19.972 12:03:32 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:09:19.972 #define SPDK_CONFIG_H 00:09:19.972 #define SPDK_CONFIG_APPS 1 00:09:19.972 #define SPDK_CONFIG_ARCH native 00:09:19.972 #undef SPDK_CONFIG_ASAN 00:09:19.972 #undef SPDK_CONFIG_AVAHI 00:09:19.972 #undef SPDK_CONFIG_CET 00:09:19.972 #define SPDK_CONFIG_COVERAGE 1 00:09:19.972 #define SPDK_CONFIG_CROSS_PREFIX 00:09:19.972 #undef SPDK_CONFIG_CRYPTO 00:09:19.972 #undef SPDK_CONFIG_CRYPTO_MLX5 00:09:19.972 #undef SPDK_CONFIG_CUSTOMOCF 00:09:19.972 #undef SPDK_CONFIG_DAOS 00:09:19.972 #define SPDK_CONFIG_DAOS_DIR 00:09:19.972 #define SPDK_CONFIG_DEBUG 1 00:09:19.972 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:09:19.972 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:09:19.972 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:09:19.972 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:09:19.972 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:09:19.972 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:09:19.972 #define SPDK_CONFIG_EXAMPLES 1 00:09:19.972 #undef SPDK_CONFIG_FC 00:09:19.972 #define SPDK_CONFIG_FC_PATH 00:09:19.972 #define SPDK_CONFIG_FIO_PLUGIN 1 00:09:19.972 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:09:19.972 #undef SPDK_CONFIG_FUSE 00:09:19.972 #define SPDK_CONFIG_FUZZER 1 00:09:19.972 #define SPDK_CONFIG_FUZZER_LIB /usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:09:19.972 #undef SPDK_CONFIG_GOLANG 00:09:19.972 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:09:19.972 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:09:19.972 #undef 
SPDK_CONFIG_HAVE_LIBARCHIVE 00:09:19.972 #undef SPDK_CONFIG_HAVE_LIBBSD 00:09:19.972 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:09:19.972 #define SPDK_CONFIG_IDXD 1 00:09:19.972 #define SPDK_CONFIG_IDXD_KERNEL 1 00:09:19.972 #undef SPDK_CONFIG_IPSEC_MB 00:09:19.972 #define SPDK_CONFIG_IPSEC_MB_DIR 00:09:19.972 #define SPDK_CONFIG_ISAL 1 00:09:19.972 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:09:19.972 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:09:19.972 #define SPDK_CONFIG_LIBDIR 00:09:19.972 #undef SPDK_CONFIG_LTO 00:09:19.972 #define SPDK_CONFIG_MAX_LCORES 00:09:19.972 #define SPDK_CONFIG_NVME_CUSE 1 00:09:19.972 #undef SPDK_CONFIG_OCF 00:09:19.972 #define SPDK_CONFIG_OCF_PATH 00:09:19.972 #define SPDK_CONFIG_OPENSSL_PATH 00:09:19.972 #undef SPDK_CONFIG_PGO_CAPTURE 00:09:19.972 #undef SPDK_CONFIG_PGO_USE 00:09:19.972 #define SPDK_CONFIG_PREFIX /usr/local 00:09:19.972 #undef SPDK_CONFIG_RAID5F 00:09:19.972 #undef SPDK_CONFIG_RBD 00:09:19.972 #define SPDK_CONFIG_RDMA 1 00:09:19.972 #define SPDK_CONFIG_RDMA_PROV verbs 00:09:19.972 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:09:19.972 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:09:19.972 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:09:19.972 #undef SPDK_CONFIG_SHARED 00:09:19.972 #undef SPDK_CONFIG_SMA 00:09:19.972 #define SPDK_CONFIG_TESTS 1 00:09:19.972 #undef SPDK_CONFIG_TSAN 00:09:19.972 #define SPDK_CONFIG_UBLK 1 00:09:19.972 #define SPDK_CONFIG_UBSAN 1 00:09:19.972 #undef SPDK_CONFIG_UNIT_TESTS 00:09:19.972 #undef SPDK_CONFIG_URING 00:09:19.973 #define SPDK_CONFIG_URING_PATH 00:09:19.973 #undef SPDK_CONFIG_URING_ZNS 00:09:19.973 #undef SPDK_CONFIG_USDT 00:09:19.973 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:09:19.973 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:09:19.973 #define SPDK_CONFIG_VFIO_USER 1 00:09:19.973 #define SPDK_CONFIG_VFIO_USER_DIR 00:09:19.973 #define SPDK_CONFIG_VHOST 1 00:09:19.973 #define SPDK_CONFIG_VIRTIO 1 00:09:19.973 #undef SPDK_CONFIG_VTUNE 00:09:19.973 #define SPDK_CONFIG_VTUNE_DIR 00:09:19.973 #define SPDK_CONFIG_WERROR 1 00:09:19.973 #define SPDK_CONFIG_WPDK_DIR 00:09:19.973 #undef SPDK_CONFIG_XNVME 00:09:19.973 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:09:19.973 12:03:32 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:09:19.973 12:03:32 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:09:19.973 12:03:32 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:09:19.973 12:03:32 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:09:19.973 12:03:32 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:09:19.973 12:03:32 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:19.973 12:03:32 -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:19.973 12:03:32 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:19.973 12:03:32 -- paths/export.sh@5 -- # export PATH 00:09:19.973 12:03:32 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:19.973 12:03:32 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:09:19.973 12:03:32 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:09:19.973 12:03:32 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:09:19.973 12:03:32 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:09:19.973 12:03:32 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:09:19.973 12:03:32 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:19.973 12:03:32 -- pm/common@16 -- # TEST_TAG=N/A 00:09:19.973 12:03:32 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:09:19.973 12:03:32 -- common/autotest_common.sh@52 -- # : 1 00:09:19.973 12:03:32 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY 00:09:19.973 12:03:32 -- common/autotest_common.sh@56 -- # : 0 00:09:19.973 12:03:32 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:09:19.973 12:03:32 -- common/autotest_common.sh@58 -- # : 0 00:09:19.973 12:03:32 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND 00:09:19.973 12:03:32 -- common/autotest_common.sh@60 -- # : 1 00:09:19.973 12:03:32 -- common/autotest_common.sh@61 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:09:19.973 12:03:32 -- common/autotest_common.sh@62 -- # : 0 00:09:19.973 12:03:32 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST 00:09:19.973 12:03:32 -- common/autotest_common.sh@64 -- # : 00:09:19.973 12:03:32 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD 00:09:19.973 12:03:32 -- common/autotest_common.sh@66 -- # : 0 00:09:19.973 12:03:32 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD 00:09:19.973 12:03:32 
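Editor's note on the PATH values traced above: paths/export.sh prepends the same /opt/go, /opt/golangci and /opt/protoc directories each time it is sourced, so nested invocations accumulate duplicate entries (visible as the repeated runs in the exported PATH). That is harmless but noisy; a hypothetical guard, not part of the traced SPDK scripts, would keep the prepend idempotent:

  # Hypothetical helper (illustrative only): prepend a directory to PATH
  # only if it is not already present.
  prepend_path() {
      case ":$PATH:" in
          *":$1:"*) ;;               # already in PATH; do nothing
          *) PATH="$1:$PATH" ;;
      esac
  }
  prepend_path /opt/go/1.21.1/bin
  prepend_path /opt/golangci/1.54.2/bin
  prepend_path /opt/protoc/21.7/bin
  export PATH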
-- common/autotest_common.sh@68 -- # : 0 00:09:19.973 12:03:32 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL 00:09:19.973 12:03:32 -- common/autotest_common.sh@70 -- # : 0 00:09:19.973 12:03:32 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI 00:09:19.973 12:03:32 -- common/autotest_common.sh@72 -- # : 0 00:09:19.973 12:03:32 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR 00:09:19.973 12:03:32 -- common/autotest_common.sh@74 -- # : 0 00:09:19.973 12:03:32 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME 00:09:19.973 12:03:32 -- common/autotest_common.sh@76 -- # : 0 00:09:19.973 12:03:32 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR 00:09:19.973 12:03:32 -- common/autotest_common.sh@78 -- # : 0 00:09:19.973 12:03:32 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP 00:09:19.973 12:03:32 -- common/autotest_common.sh@80 -- # : 0 00:09:19.973 12:03:32 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI 00:09:19.973 12:03:32 -- common/autotest_common.sh@82 -- # : 0 00:09:19.973 12:03:32 -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE 00:09:19.973 12:03:32 -- common/autotest_common.sh@84 -- # : 0 00:09:19.973 12:03:32 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP 00:09:19.973 12:03:32 -- common/autotest_common.sh@86 -- # : 0 00:09:19.973 12:03:32 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF 00:09:19.973 12:03:32 -- common/autotest_common.sh@88 -- # : 0 00:09:19.973 12:03:32 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER 00:09:19.973 12:03:32 -- common/autotest_common.sh@90 -- # : 0 00:09:19.973 12:03:32 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU 00:09:19.973 12:03:32 -- common/autotest_common.sh@92 -- # : 1 00:09:19.973 12:03:32 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER 00:09:19.973 12:03:32 -- common/autotest_common.sh@94 -- # : 1 00:09:19.973 12:03:32 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT 00:09:19.973 12:03:32 -- common/autotest_common.sh@96 -- # : rdma 00:09:19.973 12:03:32 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT 00:09:19.973 12:03:32 -- common/autotest_common.sh@98 -- # : 0 00:09:19.973 12:03:32 -- common/autotest_common.sh@99 -- # export SPDK_TEST_RBD 00:09:19.973 12:03:32 -- common/autotest_common.sh@100 -- # : 0 00:09:19.973 12:03:32 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST 00:09:19.973 12:03:32 -- common/autotest_common.sh@102 -- # : 0 00:09:19.973 12:03:32 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV 00:09:19.973 12:03:32 -- common/autotest_common.sh@104 -- # : 0 00:09:19.973 12:03:32 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT 00:09:19.973 12:03:32 -- common/autotest_common.sh@106 -- # : 0 00:09:19.973 12:03:32 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS 00:09:19.973 12:03:32 -- common/autotest_common.sh@108 -- # : 0 00:09:19.973 12:03:32 -- common/autotest_common.sh@109 -- # export SPDK_TEST_VHOST_INIT 00:09:19.973 12:03:32 -- common/autotest_common.sh@110 -- # : 0 00:09:19.973 12:03:32 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL 00:09:19.973 12:03:32 -- common/autotest_common.sh@112 -- # : 0 00:09:19.973 12:03:32 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS 00:09:19.973 12:03:32 -- common/autotest_common.sh@114 -- # : 0 00:09:19.973 12:03:32 -- common/autotest_common.sh@115 -- # export SPDK_RUN_ASAN 00:09:19.973 
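The long run of `: 0` / `export SPDK_TEST_...` record pairs through this stretch of the trace (continuing below) is consistent with the standard default-and-export shell idiom: a `${VAR:=default}` expansion assigns only when the variable is unset, and `set -x` prints the command after expansion, which is why the trace shows `: 0`, `: 1`, or `: rdma` rather than the source text. A minimal sketch, assuming this is how autotest_common.sh spells it (the exact source is not shown in the trace):

  # Give each test switch a default, then export it; under 'set -x' the
  # first line of each pair is traced as ': 0' once ${...:=0} has expanded.
  : "${SPDK_TEST_NVME:=0}"
  export SPDK_TEST_NVME
  : "${SPDK_TEST_FUZZER:=1}"
  export SPDK_TEST_FUZZER
  : "${SPDK_TEST_NVMF_TRANSPORT:=rdma}"
  export SPDK_TEST_NVMF_TRANSPORT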
12:03:32 -- common/autotest_common.sh@116 -- # : 1 00:09:19.973 12:03:32 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN 00:09:19.973 12:03:32 -- common/autotest_common.sh@118 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:09:19.973 12:03:32 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK 00:09:19.973 12:03:32 -- common/autotest_common.sh@120 -- # : 0 00:09:19.973 12:03:32 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT 00:09:19.973 12:03:32 -- common/autotest_common.sh@122 -- # : 0 00:09:19.973 12:03:32 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO 00:09:19.973 12:03:32 -- common/autotest_common.sh@124 -- # : 0 00:09:19.973 12:03:32 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL 00:09:19.973 12:03:32 -- common/autotest_common.sh@126 -- # : 0 00:09:19.973 12:03:32 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF 00:09:19.973 12:03:32 -- common/autotest_common.sh@128 -- # : 0 00:09:19.973 12:03:32 -- common/autotest_common.sh@129 -- # export SPDK_TEST_VMD 00:09:19.973 12:03:32 -- common/autotest_common.sh@130 -- # : 0 00:09:19.973 12:03:32 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL 00:09:19.973 12:03:32 -- common/autotest_common.sh@132 -- # : v23.11 00:09:19.973 12:03:32 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK 00:09:19.973 12:03:32 -- common/autotest_common.sh@134 -- # : true 00:09:19.973 12:03:32 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X 00:09:19.973 12:03:32 -- common/autotest_common.sh@136 -- # : 0 00:09:19.973 12:03:32 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5 00:09:19.973 12:03:32 -- common/autotest_common.sh@138 -- # : 0 00:09:19.973 12:03:32 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING 00:09:19.973 12:03:32 -- common/autotest_common.sh@140 -- # : 0 00:09:19.973 12:03:32 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT 00:09:19.973 12:03:32 -- common/autotest_common.sh@142 -- # : 0 00:09:19.973 12:03:32 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO 00:09:19.973 12:03:32 -- common/autotest_common.sh@144 -- # : 0 00:09:19.973 12:03:32 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER 00:09:19.973 12:03:32 -- common/autotest_common.sh@146 -- # : 0 00:09:19.973 12:03:32 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD 00:09:19.973 12:03:32 -- common/autotest_common.sh@148 -- # : 00:09:19.973 12:03:32 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS 00:09:19.973 12:03:32 -- common/autotest_common.sh@150 -- # : 0 00:09:19.973 12:03:32 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA 00:09:19.973 12:03:32 -- common/autotest_common.sh@152 -- # : 0 00:09:19.973 12:03:32 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS 00:09:19.973 12:03:32 -- common/autotest_common.sh@154 -- # : 0 00:09:19.973 12:03:32 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME 00:09:19.973 12:03:32 -- common/autotest_common.sh@156 -- # : 0 00:09:19.973 12:03:32 -- common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA 00:09:19.973 12:03:32 -- common/autotest_common.sh@158 -- # : 0 00:09:19.973 12:03:32 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA 00:09:19.973 12:03:32 -- common/autotest_common.sh@160 -- # : 0 00:09:19.973 12:03:32 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT 00:09:19.974 12:03:32 -- common/autotest_common.sh@163 -- # : 00:09:19.974 12:03:32 -- 
common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET 00:09:19.974 12:03:32 -- common/autotest_common.sh@165 -- # : 0 00:09:19.974 12:03:32 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS 00:09:19.974 12:03:32 -- common/autotest_common.sh@167 -- # : 0 00:09:19.974 12:03:32 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT 00:09:19.974 12:03:32 -- common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:09:19.974 12:03:32 -- common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:09:19.974 12:03:32 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:09:19.974 12:03:32 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:09:19.974 12:03:32 -- common/autotest_common.sh@173 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:09:19.974 12:03:32 -- common/autotest_common.sh@173 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:09:19.974 12:03:32 -- common/autotest_common.sh@174 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:09:19.974 12:03:32 -- common/autotest_common.sh@174 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:09:19.974 12:03:32 -- common/autotest_common.sh@177 
-- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:09:19.974 12:03:32 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:09:19.974 12:03:32 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:09:19.974 12:03:32 -- common/autotest_common.sh@181 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:09:19.974 12:03:32 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1 00:09:19.974 12:03:32 -- common/autotest_common.sh@185 -- # PYTHONDONTWRITEBYTECODE=1 00:09:19.974 12:03:32 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:09:19.974 12:03:32 -- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:09:19.974 12:03:32 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:09:19.974 12:03:32 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:09:19.974 12:03:32 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:09:19.974 12:03:32 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file 00:09:19.974 12:03:32 -- common/autotest_common.sh@196 -- # cat 00:09:19.974 12:03:32 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so 00:09:19.974 12:03:32 -- common/autotest_common.sh@224 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:09:19.974 12:03:32 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:09:19.974 12:03:32 -- common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:09:19.974 12:03:32 -- common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:09:19.974 12:03:32 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']' 00:09:19.974 12:03:32 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR 00:09:19.974 12:03:32 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:09:19.974 12:03:32 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:09:19.974 12:03:32 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:09:19.974 12:03:32 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:09:19.974 12:03:32 -- 
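The sanitizer setup traced just above writes a LeakSanitizer suppression file (the `echo leak:libfuse3.so` at autotest_common.sh@222) and points LSAN_OPTIONS at it, alongside fixed ASAN/UBSAN option strings. A condensed sketch of that setup; the file path and option values are taken verbatim from the trace, while the surrounding shape is illustrative:

  # Regenerate the suppression file and teach LeakSanitizer to ignore a
  # known leak report from libfuse3.
  asan_suppression_file=/var/tmp/asan_suppression_file
  rm -rf "$asan_suppression_file"
  echo "leak:libfuse3.so" > "$asan_suppression_file"
  export LSAN_OPTIONS="suppressions=$asan_suppression_file"
  export ASAN_OPTIONS="new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0"
  export UBSAN_OPTIONS="halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134"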
common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:09:19.974 12:03:32 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:09:19.974 12:03:32 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:09:19.974 12:03:32 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:09:19.974 12:03:32 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:09:19.974 12:03:32 -- common/autotest_common.sh@242 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:09:19.974 12:03:32 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:09:19.974 12:03:32 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:09:19.974 12:03:32 -- common/autotest_common.sh@248 -- # '[' 0 -eq 0 ']' 00:09:19.974 12:03:32 -- common/autotest_common.sh@249 -- # export valgrind= 00:09:19.974 12:03:32 -- common/autotest_common.sh@249 -- # valgrind= 00:09:19.974 12:03:32 -- common/autotest_common.sh@255 -- # uname -s 00:09:19.974 12:03:32 -- common/autotest_common.sh@255 -- # '[' Linux = Linux ']' 00:09:19.974 12:03:32 -- common/autotest_common.sh@256 -- # HUGEMEM=4096 00:09:19.974 12:03:32 -- common/autotest_common.sh@257 -- # export CLEAR_HUGE=yes 00:09:19.974 12:03:32 -- common/autotest_common.sh@257 -- # CLEAR_HUGE=yes 00:09:19.974 12:03:32 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:09:19.974 12:03:32 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:09:19.974 12:03:32 -- common/autotest_common.sh@265 -- # MAKE=make 00:09:19.974 12:03:32 -- common/autotest_common.sh@266 -- # MAKEFLAGS=-j72 00:09:19.974 12:03:32 -- common/autotest_common.sh@282 -- # export HUGEMEM=4096 00:09:19.974 12:03:32 -- common/autotest_common.sh@282 -- # HUGEMEM=4096 00:09:19.974 12:03:32 -- common/autotest_common.sh@284 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:09:19.974 12:03:32 -- common/autotest_common.sh@289 -- # NO_HUGE=() 00:09:19.974 12:03:32 -- common/autotest_common.sh@290 -- # TEST_MODE= 00:09:19.974 12:03:32 -- common/autotest_common.sh@309 -- # [[ -z 2690952 ]] 00:09:19.974 12:03:32 -- common/autotest_common.sh@309 -- # kill -0 2690952 00:09:19.974 12:03:32 -- common/autotest_common.sh@1665 -- # set_test_storage 2147483648 00:09:19.974 12:03:32 -- common/autotest_common.sh@319 -- # [[ -v testdir ]] 00:09:19.974 12:03:32 -- common/autotest_common.sh@321 -- # local requested_size=2147483648 00:09:19.974 12:03:32 -- common/autotest_common.sh@322 -- # local mount target_dir 00:09:19.974 12:03:32 -- common/autotest_common.sh@324 -- # local -A mounts fss sizes avails uses 00:09:19.974 12:03:32 -- common/autotest_common.sh@325 -- # local source fs size avail mount use 00:09:19.974 12:03:32 -- common/autotest_common.sh@327 -- # local storage_fallback storage_candidates 00:09:19.974 12:03:32 -- common/autotest_common.sh@329 -- # mktemp -udt spdk.XXXXXX 00:09:20.234 12:03:33 -- common/autotest_common.sh@329 -- # storage_fallback=/tmp/spdk.cYw9a0 00:09:20.234 12:03:33 -- common/autotest_common.sh@334 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:09:20.234 12:03:33 -- common/autotest_common.sh@336 -- # [[ -n '' ]] 00:09:20.234 12:03:33 -- common/autotest_common.sh@341 -- # 
[[ -n '' ]] 00:09:20.234 12:03:33 -- common/autotest_common.sh@346 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf /tmp/spdk.cYw9a0/tests/nvmf /tmp/spdk.cYw9a0 00:09:20.234 12:03:33 -- common/autotest_common.sh@349 -- # requested_size=2214592512 00:09:20.234 12:03:33 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:09:20.234 12:03:33 -- common/autotest_common.sh@318 -- # df -T 00:09:20.234 12:03:33 -- common/autotest_common.sh@318 -- # grep -v Filesystem 00:09:20.234 12:03:33 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_devtmpfs 00:09:20.234 12:03:33 -- common/autotest_common.sh@352 -- # fss["$mount"]=devtmpfs 00:09:20.234 12:03:33 -- common/autotest_common.sh@353 -- # avails["$mount"]=67108864 00:09:20.234 12:03:33 -- common/autotest_common.sh@353 -- # sizes["$mount"]=67108864 00:09:20.234 12:03:33 -- common/autotest_common.sh@354 -- # uses["$mount"]=0 00:09:20.234 12:03:33 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:09:20.234 12:03:33 -- common/autotest_common.sh@352 -- # mounts["$mount"]=/dev/pmem0 00:09:20.234 12:03:33 -- common/autotest_common.sh@352 -- # fss["$mount"]=ext2 00:09:20.234 12:03:33 -- common/autotest_common.sh@353 -- # avails["$mount"]=902909952 00:09:20.234 12:03:33 -- common/autotest_common.sh@353 -- # sizes["$mount"]=5284429824 00:09:20.234 12:03:33 -- common/autotest_common.sh@354 -- # uses["$mount"]=4381519872 00:09:20.234 12:03:33 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:09:20.234 12:03:33 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_root 00:09:20.234 12:03:33 -- common/autotest_common.sh@352 -- # fss["$mount"]=overlay 00:09:20.234 12:03:33 -- common/autotest_common.sh@353 -- # avails["$mount"]=80796073984 00:09:20.234 12:03:33 -- common/autotest_common.sh@353 -- # sizes["$mount"]=94508556288 00:09:20.234 12:03:33 -- common/autotest_common.sh@354 -- # uses["$mount"]=13712482304 00:09:20.234 12:03:33 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:09:20.234 12:03:33 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:09:20.234 12:03:33 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:09:20.234 12:03:33 -- common/autotest_common.sh@353 -- # avails["$mount"]=47251685376 00:09:20.234 12:03:33 -- common/autotest_common.sh@353 -- # sizes["$mount"]=47254278144 00:09:20.234 12:03:33 -- common/autotest_common.sh@354 -- # uses["$mount"]=2592768 00:09:20.234 12:03:33 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:09:20.234 12:03:33 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:09:20.234 12:03:33 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:09:20.234 12:03:33 -- common/autotest_common.sh@353 -- # avails["$mount"]=18895622144 00:09:20.234 12:03:33 -- common/autotest_common.sh@353 -- # sizes["$mount"]=18901712896 00:09:20.234 12:03:33 -- common/autotest_common.sh@354 -- # uses["$mount"]=6090752 00:09:20.234 12:03:33 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:09:20.234 12:03:33 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:09:20.234 12:03:33 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:09:20.234 12:03:33 -- common/autotest_common.sh@353 -- # avails["$mount"]=47253745664 00:09:20.234 12:03:33 -- common/autotest_common.sh@353 -- # sizes["$mount"]=47254278144 00:09:20.234 12:03:33 -- 
common/autotest_common.sh@354 -- # uses["$mount"]=532480 00:09:20.234 12:03:33 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:09:20.234 12:03:33 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:09:20.234 12:03:33 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:09:20.234 12:03:33 -- common/autotest_common.sh@353 -- # avails["$mount"]=9450848256 00:09:20.234 12:03:33 -- common/autotest_common.sh@353 -- # sizes["$mount"]=9450852352 00:09:20.234 12:03:33 -- common/autotest_common.sh@354 -- # uses["$mount"]=4096 00:09:20.234 12:03:33 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:09:20.234 12:03:33 -- common/autotest_common.sh@357 -- # printf '* Looking for test storage...\n' 00:09:20.234 * Looking for test storage... 00:09:20.234 12:03:33 -- common/autotest_common.sh@359 -- # local target_space new_size 00:09:20.234 12:03:33 -- common/autotest_common.sh@360 -- # for target_dir in "${storage_candidates[@]}" 00:09:20.234 12:03:33 -- common/autotest_common.sh@363 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:09:20.234 12:03:33 -- common/autotest_common.sh@363 -- # awk '$1 !~ /Filesystem/{print $6}' 00:09:20.234 12:03:33 -- common/autotest_common.sh@363 -- # mount=/ 00:09:20.234 12:03:33 -- common/autotest_common.sh@365 -- # target_space=80796073984 00:09:20.234 12:03:33 -- common/autotest_common.sh@366 -- # (( target_space == 0 || target_space < requested_size )) 00:09:20.234 12:03:33 -- common/autotest_common.sh@369 -- # (( target_space >= requested_size )) 00:09:20.234 12:03:33 -- common/autotest_common.sh@371 -- # [[ overlay == tmpfs ]] 00:09:20.234 12:03:33 -- common/autotest_common.sh@371 -- # [[ overlay == ramfs ]] 00:09:20.234 12:03:33 -- common/autotest_common.sh@371 -- # [[ / == / ]] 00:09:20.234 12:03:33 -- common/autotest_common.sh@372 -- # new_size=15927074816 00:09:20.234 12:03:33 -- common/autotest_common.sh@373 -- # (( new_size * 100 / sizes[/] > 95 )) 00:09:20.234 12:03:33 -- common/autotest_common.sh@378 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:09:20.234 12:03:33 -- common/autotest_common.sh@378 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:09:20.234 12:03:33 -- common/autotest_common.sh@379 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:09:20.234 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:09:20.234 12:03:33 -- common/autotest_common.sh@380 -- # return 0 00:09:20.234 12:03:33 -- common/autotest_common.sh@1667 -- # set -o errtrace 00:09:20.234 12:03:33 -- common/autotest_common.sh@1668 -- # shopt -s extdebug 00:09:20.234 12:03:33 -- common/autotest_common.sh@1669 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:09:20.234 12:03:33 -- common/autotest_common.sh@1671 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:09:20.234 12:03:33 -- common/autotest_common.sh@1672 -- # true 00:09:20.234 12:03:33 -- common/autotest_common.sh@1674 -- # xtrace_fd 00:09:20.234 12:03:33 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:09:20.234 12:03:33 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:09:20.235 12:03:33 -- common/autotest_common.sh@27 -- # exec 00:09:20.235 12:03:33 -- common/autotest_common.sh@29 -- # exec 00:09:20.235 12:03:33 -- common/autotest_common.sh@31 -- # 
xtrace_restore 00:09:20.235 12:03:33 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:09:20.235 12:03:33 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:09:20.235 12:03:33 -- common/autotest_common.sh@18 -- # set -x 00:09:20.235 12:03:33 -- nvmf/run.sh@53 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/../common.sh 00:09:20.235 12:03:33 -- ../common.sh@8 -- # pids=() 00:09:20.235 12:03:33 -- nvmf/run.sh@55 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:09:20.235 12:03:33 -- nvmf/run.sh@56 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:09:20.235 12:03:33 -- nvmf/run.sh@56 -- # fuzz_num=25 00:09:20.235 12:03:33 -- nvmf/run.sh@57 -- # (( fuzz_num != 0 )) 00:09:20.235 12:03:33 -- nvmf/run.sh@59 -- # trap 'cleanup /tmp/llvm_fuzz*; exit 1' SIGINT SIGTERM EXIT 00:09:20.235 12:03:33 -- nvmf/run.sh@61 -- # mem_size=512 00:09:20.235 12:03:33 -- nvmf/run.sh@62 -- # [[ 1 -eq 1 ]] 00:09:20.235 12:03:33 -- nvmf/run.sh@63 -- # start_llvm_fuzz_short 25 1 00:09:20.235 12:03:33 -- ../common.sh@69 -- # local fuzz_num=25 00:09:20.235 12:03:33 -- ../common.sh@70 -- # local time=1 00:09:20.235 12:03:33 -- ../common.sh@72 -- # (( i = 0 )) 00:09:20.235 12:03:33 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:20.235 12:03:33 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:09:20.235 12:03:33 -- nvmf/run.sh@23 -- # local fuzzer_type=0 00:09:20.235 12:03:33 -- nvmf/run.sh@24 -- # local timen=1 00:09:20.235 12:03:33 -- nvmf/run.sh@25 -- # local core=0x1 00:09:20.235 12:03:33 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:09:20.235 12:03:33 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_0.conf 00:09:20.235 12:03:33 -- nvmf/run.sh@29 -- # printf %02d 0 00:09:20.235 12:03:33 -- nvmf/run.sh@29 -- # port=4400 00:09:20.235 12:03:33 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:09:20.235 12:03:33 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' 00:09:20.235 12:03:33 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4400"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:20.235 12:03:33 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' -c /tmp/fuzz_json_0.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 -Z 0 -r /var/tmp/spdk0.sock 00:09:20.235 [2024-06-11 12:03:33.093890] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 
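The nvmf/run.sh trace above shows how each fuzzer instance gets its own TCP port and JSON config: the two-digit fuzzer index is appended to "44" to form the port (4400 for fuzzer 0), the stock fuzz_json.conf has its trsvcid rewritten with sed, and llvm_nvme_fuzz is started against the resulting transport ID. A condensed sketch of those steps; `$rootdir` stands in for the SPDK checkout, and the -P output, -D corpus and -r socket arguments from the full traced command line are omitted here:

  # For fuzzer 0: port 4400, config /tmp/fuzz_json_0.conf.
  fuzzer_type=0
  port="44$(printf %02d "$fuzzer_type")"
  sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
      "$rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf" > "/tmp/fuzz_json_${fuzzer_type}.conf"
  trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
  "$rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m 0x1 -s 512 \
      -F "$trid" -c "/tmp/fuzz_json_${fuzzer_type}.conf" -t 1 -Z "$fuzzer_type"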
00:09:20.235 [2024-06-11 12:03:33.093971] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2691083 ] 00:09:20.235 EAL: No free 2048 kB hugepages reported on node 1 00:09:20.493 [2024-06-11 12:03:33.464587] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:20.493 [2024-06-11 12:03:33.500497] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:20.493 [2024-06-11 12:03:33.500685] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:20.752 [2024-06-11 12:03:33.555439] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:20.752 [2024-06-11 12:03:33.571685] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4400 *** 00:09:20.752 INFO: Running with entropic power schedule (0xFF, 100). 00:09:20.752 INFO: Seed: 2540972088 00:09:20.752 INFO: Loaded 1 modules (341565 inline 8-bit counters): 341565 [0x26b198c, 0x2704fc9), 00:09:20.752 INFO: Loaded 1 PC tables (341565 PCs): 341565 [0x2704fd0,0x2c3b3a0), 00:09:20.752 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:09:20.752 INFO: A corpus is not provided, starting from an empty corpus 00:09:20.752 #2 INITED exec/s: 0 rss: 61Mb 00:09:20.752 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:20.752 This may also happen if the target rejected all inputs we tried so far 00:09:20.752 [2024-06-11 12:03:33.620933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (71) qid:0 cid:4 nsid:71717171 cdw10:71717171 cdw11:71717171 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.752 [2024-06-11 12:03:33.620964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:21.319 NEW_FUNC[1/663]: 0x49e700 in fuzz_admin_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:47 00:09:21.319 NEW_FUNC[2/663]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:21.319 #13 NEW cov: 11462 ft: 11463 corp: 2/87b lim: 320 exec/s: 0 rss: 68Mb L: 86/86 MS: 1 InsertRepeatedBytes- 00:09:21.319 [2024-06-11 12:03:34.072330] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:71717171 SGL TRANSPORT DATA BLOCK TRANSPORT 0x7171717171717171 00:09:21.319 [2024-06-11 12:03:34.072371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:21.319 NEW_FUNC[1/1]: 0x17815a0 in nvme_tcp_qpair /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_tcp.c:171 00:09:21.319 #29 NEW cov: 11598 ft: 12401 corp: 3/174b lim: 320 exec/s: 0 rss: 68Mb L: 87/87 MS: 1 CrossOver- 00:09:21.319 [2024-06-11 12:03:34.122138] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:bdbdbdbd SGL TRANSPORT DATA BLOCK TRANSPORT 0xbdbdbdbdbdbdbdbd 00:09:21.319 [2024-06-11 12:03:34.122168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:21.319 #34 NEW cov: 11604 ft: 12657 corp: 4/297b lim: 320 exec/s: 0 rss: 69Mb L: 123/123 MS: 5 
CrossOver-CopyPart-ChangeByte-EraseBytes-InsertRepeatedBytes- 00:09:21.319 [2024-06-11 12:03:34.162309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (71) qid:0 cid:4 nsid:71717171 cdw10:71717171 cdw11:71717171 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.319 [2024-06-11 12:03:34.162337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:21.319 #35 NEW cov: 11689 ft: 12895 corp: 5/383b lim: 320 exec/s: 0 rss: 69Mb L: 86/123 MS: 1 ChangeBinInt- 00:09:21.319 [2024-06-11 12:03:34.202350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:24200000 cdw11:00000000 00:09:21.319 [2024-06-11 12:03:34.202381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:21.319 #39 NEW cov: 11691 ft: 12995 corp: 6/455b lim: 320 exec/s: 0 rss: 69Mb L: 72/123 MS: 4 InsertRepeatedBytes-ChangeByte-InsertByte-CopyPart- 00:09:21.319 [2024-06-11 12:03:34.242578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (71) qid:0 cid:4 nsid:71717171 cdw10:71717171 cdw11:71717171 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.319 [2024-06-11 12:03:34.242606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:21.319 #40 NEW cov: 11691 ft: 13094 corp: 7/541b lim: 320 exec/s: 0 rss: 69Mb L: 86/123 MS: 1 ChangeBit- 00:09:21.319 [2024-06-11 12:03:34.282612] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:24200000 cdw11:00000000 00:09:21.319 [2024-06-11 12:03:34.282639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:21.319 #41 NEW cov: 11691 ft: 13168 corp: 8/613b lim: 320 exec/s: 0 rss: 69Mb L: 72/123 MS: 1 ChangeBinInt- 00:09:21.319 [2024-06-11 12:03:34.322761] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (71) qid:0 cid:4 nsid:71717171 cdw10:71717171 cdw11:71717171 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.319 [2024-06-11 12:03:34.322793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:21.320 #42 NEW cov: 11691 ft: 13205 corp: 9/685b lim: 320 exec/s: 0 rss: 69Mb L: 72/123 MS: 1 EraseBytes- 00:09:21.578 [2024-06-11 12:03:34.362885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (71) qid:0 cid:4 nsid:71717171 cdw10:71717171 cdw11:71717171 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.578 [2024-06-11 12:03:34.362914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:21.578 #43 NEW cov: 11691 ft: 13232 corp: 10/808b lim: 320 exec/s: 0 rss: 69Mb L: 123/123 MS: 1 CopyPart- 00:09:21.578 [2024-06-11 12:03:34.403022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (71) qid:0 cid:4 nsid:20aadf3 cdw10:71717171 cdw11:71717171 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.578 [2024-06-11 12:03:34.403051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:21.578 #44 NEW cov: 11691 ft: 13285 corp: 11/894b lim: 320 exec/s: 0 rss: 69Mb L: 86/123 MS: 1 CMP- DE: "\363\255\012\002\000\000\000\000"- 00:09:21.578 [2024-06-11 12:03:34.443113] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:bdbdbdbd SGL TRANSPORT DATA BLOCK TRANSPORT 0xbdbdbdbdbdbdbdbd 00:09:21.578 [2024-06-11 12:03:34.443143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:21.578 #45 NEW cov: 11691 ft: 13313 corp: 12/1017b lim: 320 exec/s: 0 rss: 69Mb L: 123/123 MS: 1 ChangeBinInt- 00:09:21.578 [2024-06-11 12:03:34.483238] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (71) qid:0 cid:4 nsid:71717171 cdw10:71717171 cdw11:71757171 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.578 [2024-06-11 12:03:34.483265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:21.578 #46 NEW cov: 11691 ft: 13330 corp: 13/1103b lim: 320 exec/s: 0 rss: 69Mb L: 86/123 MS: 1 ChangeBit- 00:09:21.578 [2024-06-11 12:03:34.513292] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (71) qid:0 cid:4 nsid:71717157 cdw10:71717171 cdw11:71717171 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.578 [2024-06-11 12:03:34.513320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:21.578 NEW_FUNC[1/1]: 0x19807e0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:21.579 #47 NEW cov: 11714 ft: 13366 corp: 14/1190b lim: 320 exec/s: 0 rss: 69Mb L: 87/123 MS: 1 InsertByte- 00:09:21.579 [2024-06-11 12:03:34.553418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (71) qid:0 cid:4 nsid:71717171 cdw10:71717171 cdw11:71717171 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.579 [2024-06-11 12:03:34.553448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:21.579 #48 NEW cov: 11714 ft: 13382 corp: 15/1276b lim: 320 exec/s: 0 rss: 69Mb L: 86/123 MS: 1 ChangeBinInt- 00:09:21.579 [2024-06-11 12:03:34.583566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (71) qid:0 cid:4 nsid:71717171 cdw10:71717171 cdw11:71717171 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.579 [2024-06-11 12:03:34.583593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:21.579 #49 NEW cov: 11714 ft: 13409 corp: 16/1362b lim: 320 exec/s: 0 rss: 69Mb L: 86/123 MS: 1 ChangeBinInt- 00:09:21.837 [2024-06-11 12:03:34.623636] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:bdbdbdbd SGL TRANSPORT DATA BLOCK TRANSPORT 0xbdbdbdbdbdbdbdbd 00:09:21.837 [2024-06-11 12:03:34.623662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:21.837 #50 NEW cov: 11714 ft: 13415 corp: 17/1465b lim: 320 exec/s: 50 rss: 69Mb L: 103/123 MS: 1 EraseBytes- 00:09:21.837 [2024-06-11 12:03:34.663692] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:20000000 cdw11:00000024 00:09:21.837 [2024-06-11 12:03:34.663720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:21.837 #51 NEW cov: 11714 ft: 13461 corp: 18/1538b lim: 320 exec/s: 51 rss: 69Mb L: 73/123 MS: 1 InsertByte- 00:09:21.837 [2024-06-11 12:03:34.703914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
ADMIN COMMAND (71) qid:0 cid:4 nsid:71717171 cdw10:71717171 cdw11:71717175 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.837 [2024-06-11 12:03:34.703941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:21.837 #52 NEW cov: 11714 ft: 13506 corp: 19/1624b lim: 320 exec/s: 52 rss: 70Mb L: 86/123 MS: 1 CopyPart- 00:09:21.837 [2024-06-11 12:03:34.744024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (71) qid:0 cid:4 nsid:20aadf3 cdw10:71717171 cdw11:71717171 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.837 [2024-06-11 12:03:34.744050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:21.837 #53 NEW cov: 11714 ft: 13528 corp: 20/1705b lim: 320 exec/s: 53 rss: 70Mb L: 81/123 MS: 1 EraseBytes- 00:09:21.837 [2024-06-11 12:03:34.784330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (71) qid:0 cid:4 nsid:71717171 cdw10:71717171 cdw11:71717171 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.837 [2024-06-11 12:03:34.784356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:21.837 [2024-06-11 12:03:34.784423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (71) qid:0 cid:5 nsid:71717171 cdw10:f3f3f3f3 cdw11:f3f3f3f3 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.837 [2024-06-11 12:03:34.784437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:21.837 #54 NEW cov: 11714 ft: 14214 corp: 21/1860b lim: 320 exec/s: 54 rss: 70Mb L: 155/155 MS: 1 InsertRepeatedBytes- 00:09:21.837 [2024-06-11 12:03:34.824268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (71) qid:0 cid:4 nsid:71717171 cdw10:71717171 cdw11:71717171 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.837 [2024-06-11 12:03:34.824293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:21.837 #55 NEW cov: 11714 ft: 14218 corp: 22/1926b lim: 320 exec/s: 55 rss: 70Mb L: 66/155 MS: 1 EraseBytes- 00:09:21.837 [2024-06-11 12:03:34.864476] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (71) qid:0 cid:4 nsid:71717171 cdw10:71717171 cdw11:71717171 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.837 [2024-06-11 12:03:34.864502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:22.096 #56 NEW cov: 11714 ft: 14222 corp: 23/2020b lim: 320 exec/s: 56 rss: 70Mb L: 94/155 MS: 1 PersAutoDict- DE: "\363\255\012\002\000\000\000\000"- 00:09:22.096 [2024-06-11 12:03:34.894674] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (71) qid:0 cid:4 nsid:71717157 cdw10:71717171 cdw11:71717171 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:22.096 [2024-06-11 12:03:34.894700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:22.096 [2024-06-11 12:03:34.894761] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (71) qid:0 cid:5 nsid:71717171 cdw10:71717171 cdw11:71717171 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:22.096 [2024-06-11 12:03:34.894775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
00:09:22.096 #57 NEW cov: 11714 ft: 14254 corp: 24/2193b lim: 320 exec/s: 57 rss: 70Mb L: 173/173 MS: 1 CrossOver- 00:09:22.096 [2024-06-11 12:03:34.934578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (71) qid:0 cid:4 nsid:71717157 cdw10:71717171 cdw11:71717171 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:22.096 [2024-06-11 12:03:34.934603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:22.096 #63 NEW cov: 11714 ft: 14266 corp: 25/2280b lim: 320 exec/s: 63 rss: 70Mb L: 87/173 MS: 1 ChangeBinInt- 00:09:22.096 [2024-06-11 12:03:34.974690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:20f70000 cdw11:00000024 00:09:22.096 [2024-06-11 12:03:34.974715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:22.096 #64 NEW cov: 11714 ft: 14285 corp: 26/2353b lim: 320 exec/s: 64 rss: 70Mb L: 73/173 MS: 1 ChangeBinInt- 00:09:22.096 [2024-06-11 12:03:35.014865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (71) qid:0 cid:4 nsid:71717171 cdw10:71717171 cdw11:71757171 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:22.096 [2024-06-11 12:03:35.014890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:22.096 #65 NEW cov: 11714 ft: 14288 corp: 27/2439b lim: 320 exec/s: 65 rss: 70Mb L: 86/173 MS: 1 CopyPart- 00:09:22.096 [2024-06-11 12:03:35.055001] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (71) qid:0 cid:4 nsid:71717171 cdw10:71717171 cdw11:71717171 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:22.096 [2024-06-11 12:03:35.055027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:22.096 #66 NEW cov: 11714 ft: 14324 corp: 28/2511b lim: 320 exec/s: 66 rss: 70Mb L: 72/173 MS: 1 ChangeBinInt- 00:09:22.096 [2024-06-11 12:03:35.095102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (71) qid:0 cid:4 nsid:20aadf3 cdw10:71717171 cdw11:71717171 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:22.096 [2024-06-11 12:03:35.095128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:22.096 #67 NEW cov: 11714 ft: 14357 corp: 29/2597b lim: 320 exec/s: 67 rss: 70Mb L: 86/173 MS: 1 ChangeBinInt- 00:09:22.355 [2024-06-11 12:03:35.135166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:20f70000 cdw11:00000024 00:09:22.355 [2024-06-11 12:03:35.135191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:22.355 #68 NEW cov: 11714 ft: 14361 corp: 30/2670b lim: 320 exec/s: 68 rss: 70Mb L: 73/173 MS: 1 ChangeBit- 00:09:22.355 [2024-06-11 12:03:35.175354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (71) qid:0 cid:4 nsid:71717126 cdw10:71717171 cdw11:71717171 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:22.355 [2024-06-11 12:03:35.175387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:22.355 #69 NEW cov: 11714 ft: 14401 corp: 31/2757b lim: 320 exec/s: 69 rss: 70Mb L: 87/173 MS: 1 InsertByte- 00:09:22.355 [2024-06-11 12:03:35.215470] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (71) qid:0 cid:4 nsid:71717171 cdw10:71717171 cdw11:71717171 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:22.355 [2024-06-11 12:03:35.215497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:22.355 #70 NEW cov: 11714 ft: 14429 corp: 32/2829b lim: 320 exec/s: 70 rss: 70Mb L: 72/173 MS: 1 PersAutoDict- DE: "\363\255\012\002\000\000\000\000"- 00:09:22.355 [2024-06-11 12:03:35.255563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (71) qid:0 cid:4 nsid:71717171 cdw10:71717171 cdw11:71757171 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:22.355 [2024-06-11 12:03:35.255590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:22.355 #71 NEW cov: 11714 ft: 14455 corp: 33/2894b lim: 320 exec/s: 71 rss: 71Mb L: 65/173 MS: 1 EraseBytes- 00:09:22.355 [2024-06-11 12:03:35.295702] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (71) qid:0 cid:4 nsid:71717171 cdw10:71717171 cdw11:71717175 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:22.355 [2024-06-11 12:03:35.295726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:22.355 #72 NEW cov: 11714 ft: 14485 corp: 34/2980b lim: 320 exec/s: 72 rss: 71Mb L: 86/173 MS: 1 ChangeBinInt- 00:09:22.355 [2024-06-11 12:03:35.335973] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:71717171 SGL TRANSPORT DATA BLOCK TRANSPORT 0x7171717171717171 00:09:22.355 [2024-06-11 12:03:35.335998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:22.355 [2024-06-11 12:03:35.336059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (71) qid:0 cid:5 nsid:71717171 cdw10:71717171 cdw11:71717171 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:22.355 [2024-06-11 12:03:35.336074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:22.355 #73 NEW cov: 11714 ft: 14505 corp: 35/3139b lim: 320 exec/s: 73 rss: 71Mb L: 159/173 MS: 1 CopyPart- 00:09:22.355 [2024-06-11 12:03:35.375929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (71) qid:0 cid:4 nsid:71717171 cdw10:71717171 cdw11:71717171 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:22.355 [2024-06-11 12:03:35.375956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:22.614 #74 NEW cov: 11714 ft: 14509 corp: 36/3225b lim: 320 exec/s: 74 rss: 71Mb L: 86/173 MS: 1 ChangeByte- 00:09:22.614 [2024-06-11 12:03:35.416003] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (71) qid:0 cid:4 nsid:71717171 cdw10:71717171 cdw11:71717171 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:22.614 [2024-06-11 12:03:35.416029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:22.614 #75 NEW cov: 11714 ft: 14529 corp: 37/3311b lim: 320 exec/s: 75 rss: 71Mb L: 86/173 MS: 1 ChangeBit- 00:09:22.614 [2024-06-11 12:03:35.446308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (71) qid:0 cid:4 nsid:20aadf3 cdw10:71717171 cdw11:71717171 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:22.614 [2024-06-11 
12:03:35.446333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:22.614 [2024-06-11 12:03:35.446396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (71) qid:0 cid:5 nsid:71717171 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:22.614 [2024-06-11 12:03:35.446411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:22.614 #76 NEW cov: 11714 ft: 14537 corp: 38/3445b lim: 320 exec/s: 76 rss: 71Mb L: 134/173 MS: 1 InsertRepeatedBytes- 00:09:22.614 [2024-06-11 12:03:35.486234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:ecececec cdw11:ecececec 00:09:22.614 [2024-06-11 12:03:35.486260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:22.614 #81 NEW cov: 11714 ft: 14544 corp: 39/3544b lim: 320 exec/s: 81 rss: 71Mb L: 99/173 MS: 5 EraseBytes-ChangeByte-ChangeBinInt-ChangeByte-InsertRepeatedBytes- 00:09:22.614 [2024-06-11 12:03:35.526539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (71) qid:0 cid:4 nsid:20aadf3 cdw10:71717171 cdw11:71717171 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:22.614 [2024-06-11 12:03:35.526565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:22.614 [2024-06-11 12:03:35.526626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (71) qid:0 cid:5 nsid:71717171 cdw10:71717671 cdw11:71717171 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:22.614 [2024-06-11 12:03:35.526640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:22.614 #82 NEW cov: 11714 ft: 14556 corp: 40/3685b lim: 320 exec/s: 82 rss: 71Mb L: 141/173 MS: 1 CopyPart- 00:09:22.614 [2024-06-11 12:03:35.566435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:09:22.614 [2024-06-11 12:03:35.566463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:22.614 #83 NEW cov: 11714 ft: 14629 corp: 41/3765b lim: 320 exec/s: 83 rss: 71Mb L: 80/173 MS: 1 PersAutoDict- DE: "\363\255\012\002\000\000\000\000"- 00:09:22.614 [2024-06-11 12:03:35.606574] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (71) qid:0 cid:4 nsid:71717171 cdw10:71717171 cdw11:71717171 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:22.614 [2024-06-11 12:03:35.606602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:22.614 #84 NEW cov: 11714 ft: 14631 corp: 42/3888b lim: 320 exec/s: 42 rss: 71Mb L: 123/173 MS: 1 ShuffleBytes- 00:09:22.614 #84 DONE cov: 11714 ft: 14631 corp: 42/3888b lim: 320 exec/s: 42 rss: 71Mb 00:09:22.614 ###### Recommended dictionary. ###### 00:09:22.614 "\363\255\012\002\000\000\000\000" # Uses: 3 00:09:22.614 ###### End of recommended dictionary. 
###### 00:09:22.614 Done 84 runs in 2 second(s) 00:09:22.873 12:03:35 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_0.conf 00:09:22.873 12:03:35 -- ../common.sh@72 -- # (( i++ )) 00:09:22.873 12:03:35 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:22.873 12:03:35 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:09:22.873 12:03:35 -- nvmf/run.sh@23 -- # local fuzzer_type=1 00:09:22.873 12:03:35 -- nvmf/run.sh@24 -- # local timen=1 00:09:22.873 12:03:35 -- nvmf/run.sh@25 -- # local core=0x1 00:09:22.873 12:03:35 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:09:22.873 12:03:35 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_1.conf 00:09:22.873 12:03:35 -- nvmf/run.sh@29 -- # printf %02d 1 00:09:22.873 12:03:35 -- nvmf/run.sh@29 -- # port=4401 00:09:22.873 12:03:35 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:09:22.873 12:03:35 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' 00:09:22.873 12:03:35 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4401"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:22.873 12:03:35 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' -c /tmp/fuzz_json_1.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 -Z 1 -r /var/tmp/spdk1.sock 00:09:22.873 [2024-06-11 12:03:35.809697] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:09:22.873 [2024-06-11 12:03:35.809767] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2691448 ] 00:09:22.873 EAL: No free 2048 kB hugepages reported on node 1 00:09:23.132 [2024-06-11 12:03:36.066325] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:23.132 [2024-06-11 12:03:36.093122] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:23.132 [2024-06-11 12:03:36.093296] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:23.132 [2024-06-11 12:03:36.147777] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:23.132 [2024-06-11 12:03:36.164017] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4401 *** 00:09:23.391 INFO: Running with entropic power schedule (0xFF, 100). 00:09:23.391 INFO: Seed: 840001491 00:09:23.391 INFO: Loaded 1 modules (341565 inline 8-bit counters): 341565 [0x26b198c, 0x2704fc9), 00:09:23.391 INFO: Loaded 1 PC tables (341565 PCs): 341565 [0x2704fd0,0x2c3b3a0), 00:09:23.391 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:09:23.391 INFO: A corpus is not provided, starting from an empty corpus 00:09:23.391 #2 INITED exec/s: 0 rss: 61Mb 00:09:23.391 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
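At this point run 0 has finished ("Done 84 runs in 2 second(s)") and the trace has moved on to fuzzer 1 on port 4401. The driver logic visible at ../common.sh@72-73 and nvmf/run.sh@55-56 amounts to the loop sketched below: the number of fuzz targets is counted from the `.fn =` entries in llvm_nvme_fuzz.c's dispatch table, and each index is run in turn for the configured time on core mask 0x1. A sketch under those assumptions, with the start_llvm_fuzz body (shown piecemeal in the trace) omitted:

  # Count the fuzz targets registered in the dispatch table, then run each
  # one for $time seconds on core mask 0x1.
  fuzzfile=$rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c
  fuzz_num=$(grep -c '\.fn =' "$fuzzfile")       # 25 in this run
  time=1
  for (( i = 0; i < fuzz_num; i++ )); do
      start_llvm_fuzz "$i" "$time" 0x1           # fuzzer index, seconds, cores
  done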
00:09:23.391 This may also happen if the target rejected all inputs we tried so far 00:09:23.391 [2024-06-11 12:03:36.219400] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eded 00:09:23.391 [2024-06-11 12:03:36.219670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:450a0245 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:23.391 [2024-06-11 12:03:36.219708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:23.649 NEW_FUNC[1/664]: 0x49f000 in fuzz_admin_get_log_page_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:67 00:09:23.649 NEW_FUNC[2/664]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:23.649 #27 NEW cov: 11531 ft: 11523 corp: 2/7b lim: 30 exec/s: 0 rss: 68Mb L: 6/6 MS: 5 ChangeByte-ShuffleBytes-CrossOver-InsertByte-CopyPart- 00:09:23.908 [2024-06-11 12:03:36.690545] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000450a 00:09:23.908 [2024-06-11 12:03:36.690794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:23.908 [2024-06-11 12:03:36.690837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:23.908 #28 NEW cov: 11644 ft: 11896 corp: 3/17b lim: 30 exec/s: 0 rss: 69Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:09:23.908 [2024-06-11 12:03:36.750603] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff45 00:09:23.908 [2024-06-11 12:03:36.750845] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:23.908 [2024-06-11 12:03:36.750882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:23.908 #29 NEW cov: 11650 ft: 12175 corp: 4/28b lim: 30 exec/s: 0 rss: 69Mb L: 11/11 MS: 1 CrossOver- 00:09:23.908 [2024-06-11 12:03:36.810774] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eded 00:09:23.908 [2024-06-11 12:03:36.811008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:450a0205 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:23.908 [2024-06-11 12:03:36.811041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:23.908 #30 NEW cov: 11735 ft: 12400 corp: 5/34b lim: 30 exec/s: 0 rss: 69Mb L: 6/11 MS: 1 ChangeBit- 00:09:23.908 [2024-06-11 12:03:36.860875] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000450a 00:09:23.908 [2024-06-11 12:03:36.861124] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff81ed cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:23.908 [2024-06-11 12:03:36.861158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:23.908 #31 NEW cov: 11735 ft: 12448 corp: 6/44b lim: 30 exec/s: 0 rss: 69Mb L: 10/11 MS: 1 CopyPart- 00:09:23.908 [2024-06-11 12:03:36.910990] 
ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000ed29 00:09:23.908 [2024-06-11 12:03:36.911255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:450a0245 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:23.908 [2024-06-11 12:03:36.911290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:23.908 #32 NEW cov: 11735 ft: 12521 corp: 7/51b lim: 30 exec/s: 0 rss: 69Mb L: 7/11 MS: 1 InsertByte- 00:09:24.167 [2024-06-11 12:03:36.961127] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000450a 00:09:24.167 [2024-06-11 12:03:36.961378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.167 [2024-06-11 12:03:36.961411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:24.167 #33 NEW cov: 11735 ft: 12599 corp: 8/62b lim: 30 exec/s: 0 rss: 69Mb L: 11/11 MS: 1 CrossOver- 00:09:24.167 [2024-06-11 12:03:37.021342] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000c9c9 00:09:24.167 [2024-06-11 12:03:37.021598] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0ac981c9 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.167 [2024-06-11 12:03:37.021632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:24.167 #34 NEW cov: 11735 ft: 12750 corp: 9/71b lim: 30 exec/s: 0 rss: 69Mb L: 9/11 MS: 1 InsertRepeatedBytes- 00:09:24.167 [2024-06-11 12:03:37.071441] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100000aed 00:09:24.167 [2024-06-11 12:03:37.071691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:4545810a cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.167 [2024-06-11 12:03:37.071725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:24.167 NEW_FUNC[1/1]: 0x19807e0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:24.167 #35 NEW cov: 11758 ft: 12917 corp: 10/77b lim: 30 exec/s: 0 rss: 69Mb L: 6/11 MS: 1 CopyPart- 00:09:24.167 [2024-06-11 12:03:37.121616] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff45 00:09:24.167 [2024-06-11 12:03:37.121754] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eded 00:09:24.167 [2024-06-11 12:03:37.121995] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.167 [2024-06-11 12:03:37.122030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:24.167 [2024-06-11 12:03:37.122096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0ab00245 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.167 [2024-06-11 12:03:37.122115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:24.167 #41 NEW cov: 11758 ft: 13377 corp: 
11/89b lim: 30 exec/s: 0 rss: 69Mb L: 12/12 MS: 1 InsertByte- 00:09:24.167 [2024-06-11 12:03:37.171756] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x2000045ed 00:09:24.167 [2024-06-11 12:03:37.172015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffed0245 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.167 [2024-06-11 12:03:37.172048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:24.426 #42 NEW cov: 11758 ft: 13403 corp: 12/99b lim: 30 exec/s: 42 rss: 69Mb L: 10/12 MS: 1 ShuffleBytes- 00:09:24.427 [2024-06-11 12:03:37.231948] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eded 00:09:24.427 [2024-06-11 12:03:37.232195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:450a0255 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.427 [2024-06-11 12:03:37.232236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:24.427 #48 NEW cov: 11758 ft: 13420 corp: 13/105b lim: 30 exec/s: 48 rss: 69Mb L: 6/12 MS: 1 ChangeBit- 00:09:24.427 [2024-06-11 12:03:37.272140] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:09:24.427 [2024-06-11 12:03:37.272272] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:09:24.427 [2024-06-11 12:03:37.272407] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:09:24.427 [2024-06-11 12:03:37.272526] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000c9c9 00:09:24.427 [2024-06-11 12:03:37.272782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0ac90256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.427 [2024-06-11 12:03:37.272823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:24.427 [2024-06-11 12:03:37.272892] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:56560256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.427 [2024-06-11 12:03:37.272911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:24.427 [2024-06-11 12:03:37.272976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:56560256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.427 [2024-06-11 12:03:37.272994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:24.427 [2024-06-11 12:03:37.273057] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:c9c981c9 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.427 [2024-06-11 12:03:37.273075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:24.427 #49 NEW cov: 11758 ft: 13967 corp: 14/130b lim: 30 exec/s: 49 rss: 69Mb L: 25/25 MS: 1 InsertRepeatedBytes- 00:09:24.427 [2024-06-11 12:03:37.332254] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000c9c9 00:09:24.427 [2024-06-11 12:03:37.332501] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0ac981c9 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.427 [2024-06-11 12:03:37.332536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:24.427 #50 NEW cov: 11758 ft: 13996 corp: 15/139b lim: 30 exec/s: 50 rss: 69Mb L: 9/25 MS: 1 CopyPart- 00:09:24.427 [2024-06-11 12:03:37.382422] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff45 00:09:24.427 [2024-06-11 12:03:37.382552] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eded 00:09:24.427 [2024-06-11 12:03:37.382786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.427 [2024-06-11 12:03:37.382821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:24.427 [2024-06-11 12:03:37.382887] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0ab00245 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.427 [2024-06-11 12:03:37.382908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:24.427 #51 NEW cov: 11758 ft: 14006 corp: 16/151b lim: 30 exec/s: 51 rss: 69Mb L: 12/25 MS: 1 ShuffleBytes- 00:09:24.427 [2024-06-11 12:03:37.442533] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eded 00:09:24.427 [2024-06-11 12:03:37.442787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:b6f50255 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.427 [2024-06-11 12:03:37.442821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:24.686 #52 NEW cov: 11758 ft: 14038 corp: 17/157b lim: 30 exec/s: 52 rss: 70Mb L: 6/25 MS: 1 ChangeBinInt- 00:09:24.686 [2024-06-11 12:03:37.502731] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000b6ed 00:09:24.686 [2024-06-11 12:03:37.502964] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:edf50255 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.686 [2024-06-11 12:03:37.502997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:24.686 #53 NEW cov: 11758 ft: 14051 corp: 18/163b lim: 30 exec/s: 53 rss: 70Mb L: 6/25 MS: 1 ShuffleBytes- 00:09:24.686 [2024-06-11 12:03:37.563027] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:09:24.686 [2024-06-11 12:03:37.563162] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:09:24.686 [2024-06-11 12:03:37.563284] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:09:24.686 [2024-06-11 12:03:37.563412] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000c9c9 00:09:24.686 [2024-06-11 12:03:37.563650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0ac90256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.686 [2024-06-11 12:03:37.563684] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:24.686 [2024-06-11 12:03:37.563750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:56560256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.686 [2024-06-11 12:03:37.563770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:24.686 [2024-06-11 12:03:37.563832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:56560256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.686 [2024-06-11 12:03:37.563851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:24.686 [2024-06-11 12:03:37.563913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:c9c981c9 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.686 [2024-06-11 12:03:37.563932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:24.686 #54 NEW cov: 11758 ft: 14082 corp: 19/188b lim: 30 exec/s: 54 rss: 70Mb L: 25/25 MS: 1 ChangeBinInt- 00:09:24.686 [2024-06-11 12:03:37.623088] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100000ab6 00:09:24.686 [2024-06-11 12:03:37.623324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:f5ed81f5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.686 [2024-06-11 12:03:37.623364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:24.686 #55 NEW cov: 11758 ft: 14102 corp: 20/195b lim: 30 exec/s: 55 rss: 70Mb L: 7/25 MS: 1 CopyPart- 00:09:24.686 [2024-06-11 12:03:37.683229] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000550a 00:09:24.686 [2024-06-11 12:03:37.683480] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:f5ed83f5 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.686 [2024-06-11 12:03:37.683515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:24.945 #56 NEW cov: 11758 ft: 14103 corp: 21/203b lim: 30 exec/s: 56 rss: 70Mb L: 8/25 MS: 1 InsertByte- 00:09:24.945 [2024-06-11 12:03:37.743458] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10280) > buf size (4096) 00:09:24.945 [2024-06-11 12:03:37.743588] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:09:24.945 [2024-06-11 12:03:37.743713] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x450a 00:09:24.945 [2024-06-11 12:03:37.743936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a090000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.945 [2024-06-11 12:03:37.743970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:24.945 [2024-06-11 12:03:37.744034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00008300 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.945 [2024-06-11 12:03:37.744055] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:24.945 [2024-06-11 12:03:37.744119] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ff45000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.945 [2024-06-11 12:03:37.744142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:24.945 #57 NEW cov: 11781 ft: 14355 corp: 22/223b lim: 30 exec/s: 57 rss: 70Mb L: 20/25 MS: 1 CMP- DE: "\011\000\000\000\000\000\000\000"- 00:09:24.945 [2024-06-11 12:03:37.793701] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:09:24.945 [2024-06-11 12:03:37.793836] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:09:24.945 [2024-06-11 12:03:37.793962] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200005656 00:09:24.945 [2024-06-11 12:03:37.794087] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000c9c9 00:09:24.945 [2024-06-11 12:03:37.794322] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0ac90256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.945 [2024-06-11 12:03:37.794356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:24.945 [2024-06-11 12:03:37.794436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:001c0256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.945 [2024-06-11 12:03:37.794455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:24.945 [2024-06-11 12:03:37.794519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:56560256 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.945 [2024-06-11 12:03:37.794538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:24.945 [2024-06-11 12:03:37.794602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:c9c981c9 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.945 [2024-06-11 12:03:37.794621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:24.945 #58 NEW cov: 11781 ft: 14374 corp: 23/248b lim: 30 exec/s: 58 rss: 70Mb L: 25/25 MS: 1 CMP- DE: "\000\034"- 00:09:24.945 [2024-06-11 12:03:37.843745] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (594988) > buf size (4096) 00:09:24.946 [2024-06-11 12:03:37.843876] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000950f 00:09:24.946 [2024-06-11 12:03:37.844129] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:450a0205 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.946 [2024-06-11 12:03:37.844162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:24.946 [2024-06-11 12:03:37.844229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:859f02cd cdw11:00000002 SGL TRANSPORT 
DATA BLOCK TRANSPORT 0x0 00:09:24.946 [2024-06-11 12:03:37.844249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:24.946 #64 NEW cov: 11781 ft: 14431 corp: 24/262b lim: 30 exec/s: 64 rss: 70Mb L: 14/25 MS: 1 CMP- DE: "l\205\237\315\022\225\017\000"- 00:09:24.946 [2024-06-11 12:03:37.903909] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (594988) > buf size (4096) 00:09:24.946 [2024-06-11 12:03:37.904044] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000950f 00:09:24.946 [2024-06-11 12:03:37.904281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:450a0205 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.946 [2024-06-11 12:03:37.904315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:24.946 [2024-06-11 12:03:37.904384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:859f02cd cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.946 [2024-06-11 12:03:37.904407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:24.946 #65 NEW cov: 11781 ft: 14440 corp: 25/276b lim: 30 exec/s: 65 rss: 70Mb L: 14/25 MS: 1 CopyPart- 00:09:24.946 [2024-06-11 12:03:37.964066] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (594988) > buf size (4096) 00:09:24.946 [2024-06-11 12:03:37.964195] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000120f 00:09:24.946 [2024-06-11 12:03:37.964438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:450a0205 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.946 [2024-06-11 12:03:37.964471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:24.946 [2024-06-11 12:03:37.964538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:859f8195 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.946 [2024-06-11 12:03:37.964558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:25.205 #66 NEW cov: 11781 ft: 14471 corp: 26/290b lim: 30 exec/s: 66 rss: 70Mb L: 14/25 MS: 1 ShuffleBytes- 00:09:25.205 [2024-06-11 12:03:38.014186] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100001295 00:09:25.205 [2024-06-11 12:03:38.014318] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100000b0b 00:09:25.205 [2024-06-11 12:03:38.014571] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:6c85819f cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:25.205 [2024-06-11 12:03:38.014605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:25.205 [2024-06-11 12:03:38.014673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0f008121 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:25.205 [2024-06-11 12:03:38.014693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
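
The ctrlr.c errors interleaved through this run are the two target-side validations the fuzzer keeps tripping: nvmf_ctrlr_get_log_page rejects misaligned offsets ("Invalid log page offset 0x...") and over-long reads ("Get log page: len (594988) > buf size (4096)"). Get Log Page carries a 64-bit read offset split across CDW12 (LPOL) and CDW13 (LPOU), and none of the offsets printed above (0x20000eded, 0x30000450a, 0x100001295, ...) is 4-byte aligned, so each command completes with INVALID FIELD (00/02). The following is a hedged sketch of the shape of that check, inferred from the error strings in this log rather than copied from lib/nvmf/ctrlr.c:

    #include <inttypes.h>
    #include <stdbool.h>
    #include <stdint.h>
    #include <stdio.h>

    /* Sketch of the validation behind the "Invalid log page offset" errors
     * (nvmf_ctrlr_get_log_page); the real SPDK code may differ in detail. */
    static bool log_page_offset_ok(uint32_t cdw12_lpol, uint32_t cdw13_lpou)
    {
        uint64_t offset = ((uint64_t)cdw13_lpou << 32) | cdw12_lpol;

        if (offset & 3) { /* read offset must be dword-aligned */
            fprintf(stderr, "Invalid log page offset 0x%" PRIx64 "\n", offset);
            return false; /* completed with INVALID FIELD (00/02) */
        }
        return true;
    }

Each rejected command still yields the GET LOG PAGE NOTICE plus its INVALID FIELD completion, which is the command/completion pairing dumped ahead of each "#N NEW" status line.
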
00:09:25.205 #70 NEW cov: 11781 ft: 14548 corp: 27/302b lim: 30 exec/s: 70 rss: 70Mb L: 12/25 MS: 4 ChangeBit-InsertByte-CopyPart-PersAutoDict- DE: "l\205\237\315\022\225\017\000"- 00:09:25.205 [2024-06-11 12:03:38.064305] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eded 00:09:25.205 [2024-06-11 12:03:38.064558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00060205 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:25.205 [2024-06-11 12:03:38.064590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:25.205 #71 NEW cov: 11781 ft: 14559 corp: 28/308b lim: 30 exec/s: 71 rss: 70Mb L: 6/25 MS: 1 ChangeBinInt- 00:09:25.205 [2024-06-11 12:03:38.114580] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff45 00:09:25.205 [2024-06-11 12:03:38.114710] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100006161 00:09:25.205 [2024-06-11 12:03:38.114833] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100006161 00:09:25.205 [2024-06-11 12:03:38.114951] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100006161 00:09:25.205 [2024-06-11 12:03:38.115203] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:25.205 [2024-06-11 12:03:38.115236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:25.205 [2024-06-11 12:03:38.115305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0a458161 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:25.205 [2024-06-11 12:03:38.115324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:25.205 [2024-06-11 12:03:38.115398] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:61618161 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:25.205 [2024-06-11 12:03:38.115418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:25.205 [2024-06-11 12:03:38.115483] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:61618161 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:25.205 [2024-06-11 12:03:38.115501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:25.205 #72 NEW cov: 11781 ft: 14589 corp: 29/335b lim: 30 exec/s: 72 rss: 70Mb L: 27/27 MS: 1 InsertRepeatedBytes- 00:09:25.205 [2024-06-11 12:03:38.164603] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000450a 00:09:25.205 [2024-06-11 12:03:38.164734] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eded 00:09:25.205 [2024-06-11 12:03:38.164977] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ff45810a cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:25.205 [2024-06-11 12:03:38.165011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:25.205 [2024-06-11 
12:03:38.165076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:45ed02ff cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:25.205 [2024-06-11 12:03:38.165096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:25.205 #73 NEW cov: 11781 ft: 14613 corp: 30/351b lim: 30 exec/s: 36 rss: 70Mb L: 16/27 MS: 1 CrossOver- 00:09:25.205 #73 DONE cov: 11781 ft: 14613 corp: 30/351b lim: 30 exec/s: 36 rss: 70Mb 00:09:25.205 ###### Recommended dictionary. ###### 00:09:25.205 "\011\000\000\000\000\000\000\000" # Uses: 0 00:09:25.205 "\000\034" # Uses: 0 00:09:25.205 "l\205\237\315\022\225\017\000" # Uses: 1 00:09:25.205 ###### End of recommended dictionary. ###### 00:09:25.205 Done 73 runs in 2 second(s) 00:09:25.464 12:03:38 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_1.conf 00:09:25.464 12:03:38 -- ../common.sh@72 -- # (( i++ )) 00:09:25.464 12:03:38 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:25.464 12:03:38 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:09:25.464 12:03:38 -- nvmf/run.sh@23 -- # local fuzzer_type=2 00:09:25.464 12:03:38 -- nvmf/run.sh@24 -- # local timen=1 00:09:25.464 12:03:38 -- nvmf/run.sh@25 -- # local core=0x1 00:09:25.464 12:03:38 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:09:25.464 12:03:38 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_2.conf 00:09:25.464 12:03:38 -- nvmf/run.sh@29 -- # printf %02d 2 00:09:25.464 12:03:38 -- nvmf/run.sh@29 -- # port=4402 00:09:25.464 12:03:38 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:09:25.464 12:03:38 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' 00:09:25.464 12:03:38 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4402"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:25.464 12:03:38 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' -c /tmp/fuzz_json_2.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 -Z 2 -r /var/tmp/spdk2.sock 00:09:25.464 [2024-06-11 12:03:38.378193] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 
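
The "Recommended dictionary" footer printed at the end of run 1 above lists byte strings that libFuzzer's comparison instrumentation found productive, in its -dict format with C-style octal escapes; "# Uses: N" counts how often an entry was replayed, and the "CMP-" and "PersAutoDict- DE: ..." tags in the MS: sequences mark the mutations that consumed them. As a hedged illustration (the array name below is ours, not from the harness), one entry decodes byte by byte: 'l' is 0x6c, \205 octal is 0x85, and so on.

    #include <stdint.h>

    /* "l\205\237\315\022\225\017\000" from the run-1 dictionary, decoded to
     * raw bytes; saved in libFuzzer -dict=... format it could seed a later
     * run. The identifier dict_entry is purely illustrative. */
    static const uint8_t dict_entry[8] = {
        0x6c, 0x85, 0x9f, 0xcd, 0x12, 0x95, 0x0f, 0x00
    };

The lines that follow are fuzzer type 2 starting up under the same run.sh loop, now invoked with -Z 2 against port 4402 and driving Identify (opcode 06) through fuzz_admin_identify_command.
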
00:09:25.464 [2024-06-11 12:03:38.378279] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2691734 ] 00:09:25.464 EAL: No free 2048 kB hugepages reported on node 1 00:09:25.723 [2024-06-11 12:03:38.636872] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:25.723 [2024-06-11 12:03:38.663406] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:25.723 [2024-06-11 12:03:38.663582] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:25.723 [2024-06-11 12:03:38.718127] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:25.723 [2024-06-11 12:03:38.734374] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4402 *** 00:09:25.723 INFO: Running with entropic power schedule (0xFF, 100). 00:09:25.723 INFO: Seed: 3409007022 00:09:25.982 INFO: Loaded 1 modules (341565 inline 8-bit counters): 341565 [0x26b198c, 0x2704fc9), 00:09:25.982 INFO: Loaded 1 PC tables (341565 PCs): 341565 [0x2704fd0,0x2c3b3a0), 00:09:25.982 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:09:25.982 INFO: A corpus is not provided, starting from an empty corpus 00:09:25.982 #2 INITED exec/s: 0 rss: 61Mb 00:09:25.982 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:25.982 This may also happen if the target rejected all inputs we tried so far 00:09:25.982 [2024-06-11 12:03:38.789785] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:09:25.982 [2024-06-11 12:03:38.789933] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:09:25.982 [2024-06-11 12:03:38.790064] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:09:25.982 [2024-06-11 12:03:38.790321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:25.982 [2024-06-11 12:03:38.790372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:25.982 [2024-06-11 12:03:38.790444] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:25.982 [2024-06-11 12:03:38.790467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:25.982 [2024-06-11 12:03:38.790534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:25.983 [2024-06-11 12:03:38.790556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:26.241 NEW_FUNC[1/663]: 0x4a1a20 in fuzz_admin_identify_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:95 00:09:26.241 NEW_FUNC[2/663]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:26.241 #6 NEW cov: 11488 ft: 11489 corp: 
2/25b lim: 35 exec/s: 0 rss: 68Mb L: 24/24 MS: 4 ShuffleBytes-CopyPart-CrossOver-InsertRepeatedBytes- 00:09:26.241 [2024-06-11 12:03:39.261856] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.241 [2024-06-11 12:03:39.261916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:26.241 [2024-06-11 12:03:39.262003] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.241 [2024-06-11 12:03:39.262031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:26.242 [2024-06-11 12:03:39.262114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.242 [2024-06-11 12:03:39.262140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:26.242 [2024-06-11 12:03:39.262219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.242 [2024-06-11 12:03:39.262251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:26.242 [2024-06-11 12:03:39.262332] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:b5b500b5 cdw11:2a00b586 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.242 [2024-06-11 12:03:39.262365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:26.501 #10 NEW cov: 11611 ft: 12496 corp: 3/60b lim: 35 exec/s: 0 rss: 68Mb L: 35/35 MS: 4 ChangeBit-InsertByte-InsertByte-InsertRepeatedBytes- 00:09:26.501 [2024-06-11 12:03:39.321752] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.501 [2024-06-11 12:03:39.321788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:26.501 [2024-06-11 12:03:39.321859] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.501 [2024-06-11 12:03:39.321878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:26.501 [2024-06-11 12:03:39.321943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.501 [2024-06-11 12:03:39.321963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:26.501 [2024-06-11 12:03:39.322029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.501 [2024-06-11 12:03:39.322047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 
00:09:26.501 [2024-06-11 12:03:39.322115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:b5b500b5 cdw11:2a00b586 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.501 [2024-06-11 12:03:39.322135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:26.501 #11 NEW cov: 11617 ft: 12718 corp: 4/95b lim: 35 exec/s: 0 rss: 68Mb L: 35/35 MS: 1 ChangeBinInt- 00:09:26.501 [2024-06-11 12:03:39.381622] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:09:26.501 [2024-06-11 12:03:39.381884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.501 [2024-06-11 12:03:39.381917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:26.501 [2024-06-11 12:03:39.381985] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.501 [2024-06-11 12:03:39.382004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:26.501 [2024-06-11 12:03:39.382074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.501 [2024-06-11 12:03:39.382093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:26.501 [2024-06-11 12:03:39.382158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:b5b500b5 cdw11:0000b500 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.501 [2024-06-11 12:03:39.382176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:26.501 [2024-06-11 12:03:39.382243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:2a000023 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.501 [2024-06-11 12:03:39.382268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:26.501 #12 NEW cov: 11702 ft: 13081 corp: 5/130b lim: 35 exec/s: 0 rss: 68Mb L: 35/35 MS: 1 ChangeBinInt- 00:09:26.501 [2024-06-11 12:03:39.431811] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:09:26.501 [2024-06-11 12:03:39.432090] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.501 [2024-06-11 12:03:39.432125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:26.501 [2024-06-11 12:03:39.432193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.501 [2024-06-11 12:03:39.432214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:26.501 [2024-06-11 12:03:39.432278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) 
qid:0 cid:6 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.501 [2024-06-11 12:03:39.432297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:26.501 [2024-06-11 12:03:39.432369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:b5b5000a cdw11:0000b500 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.501 [2024-06-11 12:03:39.432390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:26.501 [2024-06-11 12:03:39.432454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:2a000023 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.501 [2024-06-11 12:03:39.432475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:26.501 #13 NEW cov: 11702 ft: 13187 corp: 6/165b lim: 35 exec/s: 0 rss: 69Mb L: 35/35 MS: 1 CrossOver- 00:09:26.501 [2024-06-11 12:03:39.492156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.501 [2024-06-11 12:03:39.492192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:26.501 [2024-06-11 12:03:39.492257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.501 [2024-06-11 12:03:39.492277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:26.502 [2024-06-11 12:03:39.492343] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.502 [2024-06-11 12:03:39.492366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:26.502 [2024-06-11 12:03:39.492432] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.502 [2024-06-11 12:03:39.492451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:26.502 [2024-06-11 12:03:39.492517] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:b5b500b5 cdw11:2a00b586 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.502 [2024-06-11 12:03:39.492536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:26.761 #14 NEW cov: 11702 ft: 13246 corp: 7/200b lim: 35 exec/s: 0 rss: 69Mb L: 35/35 MS: 1 CopyPart- 00:09:26.761 [2024-06-11 12:03:39.551741] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:09:26.761 [2024-06-11 12:03:39.551881] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:09:26.761 [2024-06-11 12:03:39.552139] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:000000f9 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.761 [2024-06-11 
12:03:39.552173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:26.761 [2024-06-11 12:03:39.552240] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.761 [2024-06-11 12:03:39.552262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:26.761 [2024-06-11 12:03:39.552327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.761 [2024-06-11 12:03:39.552348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:26.761 #20 NEW cov: 11702 ft: 13321 corp: 8/224b lim: 35 exec/s: 0 rss: 69Mb L: 24/35 MS: 1 ChangeByte- 00:09:26.761 [2024-06-11 12:03:39.612296] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:09:26.761 [2024-06-11 12:03:39.612573] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.761 [2024-06-11 12:03:39.612608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:26.761 [2024-06-11 12:03:39.612676] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.761 [2024-06-11 12:03:39.612695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:26.761 [2024-06-11 12:03:39.612761] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.761 [2024-06-11 12:03:39.612780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:26.761 [2024-06-11 12:03:39.612846] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:b5b500b5 cdw11:0000b500 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.761 [2024-06-11 12:03:39.612864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:26.761 [2024-06-11 12:03:39.612928] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.761 [2024-06-11 12:03:39.612949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:26.761 #21 NEW cov: 11702 ft: 13352 corp: 9/259b lim: 35 exec/s: 0 rss: 69Mb L: 35/35 MS: 1 CopyPart- 00:09:26.761 [2024-06-11 12:03:39.662691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.761 [2024-06-11 12:03:39.662726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:26.761 [2024-06-11 12:03:39.662793] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.761 [2024-06-11 12:03:39.662812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:26.761 [2024-06-11 12:03:39.662878] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.761 [2024-06-11 12:03:39.662901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:26.761 [2024-06-11 12:03:39.662967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.761 [2024-06-11 12:03:39.662985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:26.761 [2024-06-11 12:03:39.663049] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:b5b500b5 cdw11:2a00b586 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.761 [2024-06-11 12:03:39.663068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:26.761 NEW_FUNC[1/1]: 0x19807e0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:26.761 #22 NEW cov: 11725 ft: 13394 corp: 10/294b lim: 35 exec/s: 0 rss: 69Mb L: 35/35 MS: 1 CopyPart- 00:09:26.761 [2024-06-11 12:03:39.712578] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:09:26.761 [2024-06-11 12:03:39.712853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.761 [2024-06-11 12:03:39.712889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:26.761 [2024-06-11 12:03:39.712957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.761 [2024-06-11 12:03:39.712979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:26.761 [2024-06-11 12:03:39.713043] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.761 [2024-06-11 12:03:39.713062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:26.761 [2024-06-11 12:03:39.713130] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:b5b5000a cdw11:0000b500 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.761 [2024-06-11 12:03:39.713150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:26.761 [2024-06-11 12:03:39.713215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:2a000023 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.761 [2024-06-11 12:03:39.713237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:26.761 #23 NEW cov: 11725 ft: 13422 corp: 11/329b lim: 35 exec/s: 0 rss: 69Mb L: 35/35 MS: 1 ShuffleBytes- 00:09:26.761 [2024-06-11 12:03:39.773013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.761 [2024-06-11 12:03:39.773049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:26.761 [2024-06-11 12:03:39.773118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.761 [2024-06-11 12:03:39.773138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:26.762 [2024-06-11 12:03:39.773205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:230000b5 cdw11:b5000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.762 [2024-06-11 12:03:39.773223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:26.762 [2024-06-11 12:03:39.773294] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.762 [2024-06-11 12:03:39.773313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:26.762 [2024-06-11 12:03:39.773383] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:b5b500b5 cdw11:2a00b586 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.762 [2024-06-11 12:03:39.773404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:27.020 #24 NEW cov: 11725 ft: 13433 corp: 12/364b lim: 35 exec/s: 24 rss: 69Mb L: 35/35 MS: 1 ChangeBinInt- 00:09:27.020 [2024-06-11 12:03:39.832974] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:09:27.020 [2024-06-11 12:03:39.833228] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.020 [2024-06-11 12:03:39.833263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:27.020 [2024-06-11 12:03:39.833330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.020 [2024-06-11 12:03:39.833349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:27.021 [2024-06-11 12:03:39.833423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.021 [2024-06-11 12:03:39.833442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:27.021 [2024-06-11 12:03:39.833508] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:b5b5000a cdw11:0000b500 SGL TRANSPORT 
DATA BLOCK TRANSPORT 0x0 00:09:27.021 [2024-06-11 12:03:39.833527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:27.021 [2024-06-11 12:03:39.833590] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:2a000023 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.021 [2024-06-11 12:03:39.833611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:27.021 #25 NEW cov: 11725 ft: 13446 corp: 13/399b lim: 35 exec/s: 25 rss: 69Mb L: 35/35 MS: 1 ChangeASCIIInt- 00:09:27.021 [2024-06-11 12:03:39.892791] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.021 [2024-06-11 12:03:39.892827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:27.021 #28 NEW cov: 11725 ft: 13852 corp: 14/411b lim: 35 exec/s: 28 rss: 69Mb L: 12/35 MS: 3 ChangeByte-ChangeByte-InsertRepeatedBytes- 00:09:27.021 [2024-06-11 12:03:39.943516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.021 [2024-06-11 12:03:39.943551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:27.021 [2024-06-11 12:03:39.943621] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.021 [2024-06-11 12:03:39.943641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:27.021 [2024-06-11 12:03:39.943705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.021 [2024-06-11 12:03:39.943729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:27.021 [2024-06-11 12:03:39.943794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:b5b500b5 cdw11:0000b500 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.021 [2024-06-11 12:03:39.943813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:27.021 [2024-06-11 12:03:39.943881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:b5b50023 cdw11:2a00b586 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.021 [2024-06-11 12:03:39.943899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:27.021 #29 NEW cov: 11725 ft: 13984 corp: 15/446b lim: 35 exec/s: 29 rss: 69Mb L: 35/35 MS: 1 ChangeBinInt- 00:09:27.021 [2024-06-11 12:03:39.993630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.021 [2024-06-11 12:03:39.993665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:27.021 [2024-06-11 12:03:39.993735] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.021 [2024-06-11 12:03:39.993755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:27.021 [2024-06-11 12:03:39.993822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.021 [2024-06-11 12:03:39.993841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:27.021 [2024-06-11 12:03:39.993903] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ff7e00ff cdw11:0e00af2c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.021 [2024-06-11 12:03:39.993922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:27.021 [2024-06-11 12:03:39.993987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:000000c7 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.021 [2024-06-11 12:03:39.994005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:27.021 #30 NEW cov: 11725 ft: 14004 corp: 16/481b lim: 35 exec/s: 30 rss: 70Mb L: 35/35 MS: 1 CMP- DE: "\377\377~\257,\016&\307"- 00:09:27.280 [2024-06-11 12:03:40.053874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.280 [2024-06-11 12:03:40.053913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:27.280 [2024-06-11 12:03:40.053982] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.280 [2024-06-11 12:03:40.054003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:27.280 [2024-06-11 12:03:40.054068] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.280 [2024-06-11 12:03:40.054087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:27.280 [2024-06-11 12:03:40.054151] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.280 [2024-06-11 12:03:40.054170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:27.280 [2024-06-11 12:03:40.054242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:b5b500b5 cdw11:2a00b586 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.280 [2024-06-11 12:03:40.054260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:27.280 #31 NEW cov: 11725 ft: 14015 corp: 17/516b lim: 35 exec/s: 31 rss: 70Mb L: 35/35 MS: 1 CopyPart- 00:09:27.280 [2024-06-11 12:03:40.103904] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:b5b5000a cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.280 [2024-06-11 12:03:40.103939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:27.280 [2024-06-11 12:03:40.104010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.280 [2024-06-11 12:03:40.104030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:27.280 [2024-06-11 12:03:40.104097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.280 [2024-06-11 12:03:40.104117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:27.281 #34 NEW cov: 11725 ft: 14064 corp: 18/542b lim: 35 exec/s: 34 rss: 70Mb L: 26/35 MS: 3 ChangeBit-ChangeBit-CrossOver- 00:09:27.281 [2024-06-11 12:03:40.143569] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:09:27.281 [2024-06-11 12:03:40.143711] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:09:27.281 [2024-06-11 12:03:40.143838] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:09:27.281 [2024-06-11 12:03:40.144068] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.281 [2024-06-11 12:03:40.144102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:27.281 [2024-06-11 12:03:40.144170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.281 [2024-06-11 12:03:40.144192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:27.281 [2024-06-11 12:03:40.144256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.281 [2024-06-11 12:03:40.144277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:27.281 #35 NEW cov: 11725 ft: 14102 corp: 19/566b lim: 35 exec/s: 35 rss: 70Mb L: 24/35 MS: 1 ChangeBit- 00:09:27.281 [2024-06-11 12:03:40.194164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:b5b5000a cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.281 [2024-06-11 12:03:40.194198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:27.281 [2024-06-11 12:03:40.194267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.281 [2024-06-11 12:03:40.194287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:09:27.281 [2024-06-11 12:03:40.194351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.281 [2024-06-11 12:03:40.194376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:27.281 #36 NEW cov: 11725 ft: 14142 corp: 20/592b lim: 35 exec/s: 36 rss: 70Mb L: 26/35 MS: 1 ShuffleBytes- 00:09:27.281 [2024-06-11 12:03:40.254376] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:b5b5000a cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.281 [2024-06-11 12:03:40.254410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:27.281 [2024-06-11 12:03:40.254478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.281 [2024-06-11 12:03:40.254498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:27.281 [2024-06-11 12:03:40.254565] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.281 [2024-06-11 12:03:40.254584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:27.281 #37 NEW cov: 11725 ft: 14180 corp: 21/618b lim: 35 exec/s: 37 rss: 70Mb L: 26/35 MS: 1 CopyPart- 00:09:27.281 [2024-06-11 12:03:40.304579] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:09:27.281 [2024-06-11 12:03:40.304853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.281 [2024-06-11 12:03:40.304887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:27.281 [2024-06-11 12:03:40.304957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.281 [2024-06-11 12:03:40.304976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:27.281 [2024-06-11 12:03:40.305042] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.281 [2024-06-11 12:03:40.305061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:27.281 [2024-06-11 12:03:40.305125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:b5b500b5 cdw11:0000b500 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.281 [2024-06-11 12:03:40.305144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:27.281 [2024-06-11 12:03:40.305209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:2a002923 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.281 [2024-06-11 
12:03:40.305230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:27.540 #38 NEW cov: 11725 ft: 14185 corp: 22/653b lim: 35 exec/s: 38 rss: 70Mb L: 35/35 MS: 1 ChangeByte- 00:09:27.540 [2024-06-11 12:03:40.354748] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:09:27.540 [2024-06-11 12:03:40.355001] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.540 [2024-06-11 12:03:40.355035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:27.540 [2024-06-11 12:03:40.355102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.540 [2024-06-11 12:03:40.355122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:27.540 [2024-06-11 12:03:40.355187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.540 [2024-06-11 12:03:40.355210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:27.540 [2024-06-11 12:03:40.355276] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:b5b5000a cdw11:0000b500 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.540 [2024-06-11 12:03:40.355295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:27.540 [2024-06-11 12:03:40.355365] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:2a000023 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.540 [2024-06-11 12:03:40.355386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:27.540 #39 NEW cov: 11725 ft: 14217 corp: 23/688b lim: 35 exec/s: 39 rss: 70Mb L: 35/35 MS: 1 ShuffleBytes- 00:09:27.540 [2024-06-11 12:03:40.414893] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:09:27.540 [2024-06-11 12:03:40.415154] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.540 [2024-06-11 12:03:40.415187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:27.540 [2024-06-11 12:03:40.415256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.540 [2024-06-11 12:03:40.415275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:27.540 [2024-06-11 12:03:40.415340] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:b5b5001e cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.540 [2024-06-11 12:03:40.415364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 
cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:27.540 [2024-06-11 12:03:40.415430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:b5b5000a cdw11:0000b500 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.540 [2024-06-11 12:03:40.415449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:27.540 [2024-06-11 12:03:40.415516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:2a000023 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.540 [2024-06-11 12:03:40.415536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:27.540 #40 NEW cov: 11725 ft: 14228 corp: 24/723b lim: 35 exec/s: 40 rss: 70Mb L: 35/35 MS: 1 CMP- DE: "\377\036"- 00:09:27.540 [2024-06-11 12:03:40.464646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00fb cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.541 [2024-06-11 12:03:40.464680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:27.541 #41 NEW cov: 11725 ft: 14247 corp: 25/735b lim: 35 exec/s: 41 rss: 70Mb L: 12/35 MS: 1 ChangeBit- 00:09:27.541 [2024-06-11 12:03:40.525213] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:09:27.541 [2024-06-11 12:03:40.525478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.541 [2024-06-11 12:03:40.525513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:27.541 [2024-06-11 12:03:40.525583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.541 [2024-06-11 12:03:40.525607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:27.541 [2024-06-11 12:03:40.525675] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.541 [2024-06-11 12:03:40.525694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:27.541 [2024-06-11 12:03:40.525759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:b5b5000a cdw11:0000b500 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.541 [2024-06-11 12:03:40.525778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:27.541 [2024-06-11 12:03:40.525844] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:2a000023 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.541 [2024-06-11 12:03:40.525864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:27.541 #42 NEW cov: 11725 ft: 14321 corp: 26/770b lim: 35 exec/s: 42 rss: 70Mb L: 35/35 MS: 1 ChangeASCIIInt- 00:09:27.800 [2024-06-11 12:03:40.585061] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:b5b500b5 cdw11:b5000ab5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.800 [2024-06-11 12:03:40.585098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:27.800 #43 NEW cov: 11725 ft: 14335 corp: 27/780b lim: 35 exec/s: 43 rss: 70Mb L: 10/35 MS: 1 CrossOver- 00:09:27.800 [2024-06-11 12:03:40.645104] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:09:27.800 [2024-06-11 12:03:40.645246] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:09:27.800 [2024-06-11 12:03:40.645385] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:09:27.800 [2024-06-11 12:03:40.645512] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:09:27.800 [2024-06-11 12:03:40.645752] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.800 [2024-06-11 12:03:40.645788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:27.800 [2024-06-11 12:03:40.645855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.800 [2024-06-11 12:03:40.645878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:27.800 [2024-06-11 12:03:40.645944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.800 [2024-06-11 12:03:40.645964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:27.800 [2024-06-11 12:03:40.646031] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:0a000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.800 [2024-06-11 12:03:40.646052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:27.800 #44 NEW cov: 11725 ft: 14455 corp: 28/809b lim: 35 exec/s: 44 rss: 70Mb L: 29/35 MS: 1 InsertRepeatedBytes- 00:09:27.800 [2024-06-11 12:03:40.705251] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:09:27.800 [2024-06-11 12:03:40.705499] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:09:27.800 [2024-06-11 12:03:40.705744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.800 [2024-06-11 12:03:40.705784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:27.800 [2024-06-11 12:03:40.705850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000018 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.800 [2024-06-11 12:03:40.705870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD 
(00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:27.800 [2024-06-11 12:03:40.705933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.800 [2024-06-11 12:03:40.705954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:27.800 #45 NEW cov: 11725 ft: 14479 corp: 29/833b lim: 35 exec/s: 45 rss: 70Mb L: 24/35 MS: 1 ChangeBinInt- 00:09:27.800 [2024-06-11 12:03:40.755866] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:09:27.800 [2024-06-11 12:03:40.756150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.800 [2024-06-11 12:03:40.756185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:27.800 [2024-06-11 12:03:40.756254] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.800 [2024-06-11 12:03:40.756274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:27.800 [2024-06-11 12:03:40.756340] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:b5b500b5 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.800 [2024-06-11 12:03:40.756365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:27.800 [2024-06-11 12:03:40.756429] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:b5b500b5 cdw11:0000b500 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.800 [2024-06-11 12:03:40.756448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:27.800 [2024-06-11 12:03:40.756513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:b5b50000 cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.800 [2024-06-11 12:03:40.756534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:27.800 #46 NEW cov: 11725 ft: 14496 corp: 30/868b lim: 35 exec/s: 23 rss: 70Mb L: 35/35 MS: 1 CrossOver- 00:09:27.800 #46 DONE cov: 11725 ft: 14496 corp: 30/868b lim: 35 exec/s: 23 rss: 70Mb 00:09:27.800 ###### Recommended dictionary. ###### 00:09:27.800 "\377\377~\257,\016&\307" # Uses: 0 00:09:27.800 "\377\036" # Uses: 0 00:09:27.800 ###### End of recommended dictionary. 
###### 00:09:27.800 Done 46 runs in 2 second(s) 00:09:28.059 12:03:40 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_2.conf 00:09:28.059 12:03:40 -- ../common.sh@72 -- # (( i++ )) 00:09:28.059 12:03:40 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:28.059 12:03:40 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:09:28.059 12:03:40 -- nvmf/run.sh@23 -- # local fuzzer_type=3 00:09:28.059 12:03:40 -- nvmf/run.sh@24 -- # local timen=1 00:09:28.059 12:03:40 -- nvmf/run.sh@25 -- # local core=0x1 00:09:28.059 12:03:40 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:09:28.059 12:03:40 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_3.conf 00:09:28.059 12:03:40 -- nvmf/run.sh@29 -- # printf %02d 3 00:09:28.059 12:03:40 -- nvmf/run.sh@29 -- # port=4403 00:09:28.059 12:03:40 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:09:28.059 12:03:40 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' 00:09:28.059 12:03:40 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4403"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:28.060 12:03:40 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' -c /tmp/fuzz_json_3.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 -Z 3 -r /var/tmp/spdk3.sock 00:09:28.060 [2024-06-11 12:03:40.951093] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:09:28.060 [2024-06-11 12:03:40.951179] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2692093 ] 00:09:28.060 EAL: No free 2048 kB hugepages reported on node 1 00:09:28.318 [2024-06-11 12:03:41.210020] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:28.318 [2024-06-11 12:03:41.236746] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:28.318 [2024-06-11 12:03:41.236924] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:28.318 [2024-06-11 12:03:41.291518] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:28.318 [2024-06-11 12:03:41.307764] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4403 *** 00:09:28.319 INFO: Running with entropic power schedule (0xFF, 100). 00:09:28.319 INFO: Seed: 1688047949 00:09:28.577 INFO: Loaded 1 modules (341565 inline 8-bit counters): 341565 [0x26b198c, 0x2704fc9), 00:09:28.577 INFO: Loaded 1 PC tables (341565 PCs): 341565 [0x2704fd0,0x2c3b3a0), 00:09:28.577 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:09:28.577 INFO: A corpus is not provided, starting from an empty corpus 00:09:28.577 #2 INITED exec/s: 0 rss: 61Mb 00:09:28.577 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:09:28.577 This may also happen if the target rejected all inputs we tried so far 00:09:28.836 NEW_FUNC[1/652]: 0x4a36f0 in fuzz_admin_abort_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:114 00:09:28.836 NEW_FUNC[2/652]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:28.836 #12 NEW cov: 11387 ft: 11388 corp: 2/5b lim: 20 exec/s: 0 rss: 68Mb L: 4/4 MS: 5 CrossOver-ChangeBinInt-CopyPart-CopyPart-InsertByte- 00:09:28.836 [2024-06-11 12:03:41.737160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:09:28.836 [2024-06-11 12:03:41.737215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:28.837 NEW_FUNC[1/20]: 0x115bd90 in nvmf_qpair_abort_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3224 00:09:28.837 NEW_FUNC[2/20]: 0x115c910 in nvmf_qpair_abort_aer /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3166 00:09:28.837 #15 NEW cov: 11839 ft: 12660 corp: 3/20b lim: 20 exec/s: 0 rss: 69Mb L: 15/15 MS: 3 ChangeByte-ChangeByte-InsertRepeatedBytes- 00:09:28.837 #21 NEW cov: 11845 ft: 12926 corp: 4/27b lim: 20 exec/s: 0 rss: 69Mb L: 7/15 MS: 1 InsertRepeatedBytes- 00:09:28.837 [2024-06-11 12:03:41.867498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:09:28.837 [2024-06-11 12:03:41.867542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:29.096 #22 NEW cov: 11930 ft: 13222 corp: 5/42b lim: 20 exec/s: 0 rss: 69Mb L: 15/15 MS: 1 CMP- DE: "\020\000"- 00:09:29.096 [2024-06-11 12:03:41.937511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:09:29.096 [2024-06-11 12:03:41.937551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:29.096 #23 NEW cov: 11931 ft: 13536 corp: 6/51b lim: 20 exec/s: 0 rss: 69Mb L: 9/15 MS: 1 EraseBytes- 00:09:29.096 [2024-06-11 12:03:41.998022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:09:29.096 [2024-06-11 12:03:41.998065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:29.096 #24 NEW cov: 11931 ft: 13587 corp: 7/66b lim: 20 exec/s: 0 rss: 69Mb L: 15/15 MS: 1 ShuffleBytes- 00:09:29.096 [2024-06-11 12:03:42.068588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:09:29.096 [2024-06-11 12:03:42.068625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:29.096 #25 NEW cov: 11948 ft: 13792 corp: 8/83b lim: 20 exec/s: 0 rss: 69Mb L: 17/17 MS: 1 CopyPart- 00:09:29.354 [2024-06-11 12:03:42.138681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:09:29.354 [2024-06-11 12:03:42.138720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY 
REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:29.354 #27 NEW cov: 11948 ft: 13946 corp: 9/98b lim: 20 exec/s: 0 rss: 69Mb L: 15/17 MS: 2 ChangeBit-InsertRepeatedBytes- 00:09:29.354 [2024-06-11 12:03:42.199305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:09:29.354 [2024-06-11 12:03:42.199342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:29.354 #28 NEW cov: 11951 ft: 14058 corp: 10/117b lim: 20 exec/s: 0 rss: 69Mb L: 19/19 MS: 1 InsertRepeatedBytes- 00:09:29.354 [2024-06-11 12:03:42.259395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:09:29.354 [2024-06-11 12:03:42.259433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:29.354 NEW_FUNC[1/1]: 0x19807e0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:29.354 #29 NEW cov: 11974 ft: 14162 corp: 11/136b lim: 20 exec/s: 0 rss: 69Mb L: 19/19 MS: 1 PersAutoDict- DE: "\020\000"- 00:09:29.354 [2024-06-11 12:03:42.329038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:09:29.354 [2024-06-11 12:03:42.329077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:29.355 #30 NEW cov: 11974 ft: 14204 corp: 12/147b lim: 20 exec/s: 30 rss: 69Mb L: 11/19 MS: 1 PersAutoDict- DE: "\020\000"- 00:09:29.613 #31 NEW cov: 11974 ft: 14245 corp: 13/151b lim: 20 exec/s: 31 rss: 69Mb L: 4/19 MS: 1 PersAutoDict- DE: "\020\000"- 00:09:29.613 [2024-06-11 12:03:42.460165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:09:29.613 [2024-06-11 12:03:42.460202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:29.613 #32 NEW cov: 11974 ft: 14278 corp: 14/170b lim: 20 exec/s: 32 rss: 70Mb L: 19/19 MS: 1 CopyPart- 00:09:29.613 [2024-06-11 12:03:42.530055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:09:29.613 [2024-06-11 12:03:42.530092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:29.613 #33 NEW cov: 11974 ft: 14318 corp: 15/181b lim: 20 exec/s: 33 rss: 70Mb L: 11/19 MS: 1 CrossOver- 00:09:29.613 #34 NEW cov: 11974 ft: 14336 corp: 16/185b lim: 20 exec/s: 34 rss: 70Mb L: 4/19 MS: 1 ChangeBit- 00:09:29.871 #35 NEW cov: 11974 ft: 14359 corp: 17/196b lim: 20 exec/s: 35 rss: 70Mb L: 11/19 MS: 1 PersAutoDict- DE: "\020\000"- 00:09:29.871 [2024-06-11 12:03:42.731369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:09:29.871 [2024-06-11 12:03:42.731409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:29.871 #36 NEW cov: 11974 ft: 14370 corp: 18/211b lim: 20 exec/s: 36 rss: 70Mb L: 15/19 MS: 1 PersAutoDict- DE: "\020\000"- 00:09:29.871 [2024-06-11 12:03:42.791612] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:09:29.871 [2024-06-11 12:03:42.791651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:29.871 #37 NEW cov: 11974 ft: 14403 corp: 19/226b lim: 20 exec/s: 37 rss: 70Mb L: 15/19 MS: 1 ShuffleBytes- 00:09:29.871 [2024-06-11 12:03:42.861992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:09:29.871 [2024-06-11 12:03:42.862029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:29.871 #38 NEW cov: 11974 ft: 14413 corp: 20/245b lim: 20 exec/s: 38 rss: 70Mb L: 19/19 MS: 1 ShuffleBytes- 00:09:30.130 [2024-06-11 12:03:42.922634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:09:30.130 [2024-06-11 12:03:42.922672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:30.130 #39 NEW cov: 11974 ft: 14549 corp: 21/262b lim: 20 exec/s: 39 rss: 70Mb L: 17/19 MS: 1 CopyPart- 00:09:30.130 [2024-06-11 12:03:42.991999] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:09:30.130 [2024-06-11 12:03:42.992035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:30.130 #40 NEW cov: 11974 ft: 14560 corp: 22/272b lim: 20 exec/s: 40 rss: 70Mb L: 10/19 MS: 1 CrossOver- 00:09:30.130 [2024-06-11 12:03:43.062706] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:3 nsid:0 cdw10:00000000 cdw11:00000000 00:09:30.130 [2024-06-11 12:03:43.062741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:3 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:30.130 [2024-06-11 12:03:43.062965] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:09:30.130 [2024-06-11 12:03:43.062990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:30.130 #41 NEW cov: 11976 ft: 14828 corp: 23/287b lim: 20 exec/s: 41 rss: 70Mb L: 15/19 MS: 1 ChangeBinInt- 00:09:30.130 [2024-06-11 12:03:43.123163] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:09:30.130 [2024-06-11 12:03:43.123199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:30.389 #42 NEW cov: 11976 ft: 14860 corp: 24/303b lim: 20 exec/s: 42 rss: 70Mb L: 16/19 MS: 1 InsertByte- 00:09:30.389 [2024-06-11 12:03:43.193662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:09:30.389 [2024-06-11 12:03:43.193699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:30.389 #43 NEW cov: 11976 ft: 14883 corp: 25/320b lim: 20 exec/s: 43 rss: 70Mb L: 17/19 MS: 1 InsertByte- 00:09:30.389 [2024-06-11 12:03:43.263640] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:09:30.389 [2024-06-11 12:03:43.263678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:30.389 #44 NEW cov: 11976 ft: 14895 corp: 26/337b lim: 20 exec/s: 44 rss: 70Mb L: 17/19 MS: 1 ChangeByte- 00:09:30.389 #45 NEW cov: 11976 ft: 14911 corp: 27/343b lim: 20 exec/s: 22 rss: 70Mb L: 6/19 MS: 1 EraseBytes- 00:09:30.389 #45 DONE cov: 11976 ft: 14911 corp: 27/343b lim: 20 exec/s: 22 rss: 70Mb 00:09:30.389 ###### Recommended dictionary. ###### 00:09:30.389 "\020\000" # Uses: 5 00:09:30.389 ###### End of recommended dictionary. ###### 00:09:30.389 Done 45 runs in 2 second(s) 00:09:30.649 12:03:43 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_3.conf 00:09:30.649 12:03:43 -- ../common.sh@72 -- # (( i++ )) 00:09:30.649 12:03:43 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:30.649 12:03:43 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:09:30.649 12:03:43 -- nvmf/run.sh@23 -- # local fuzzer_type=4 00:09:30.649 12:03:43 -- nvmf/run.sh@24 -- # local timen=1 00:09:30.649 12:03:43 -- nvmf/run.sh@25 -- # local core=0x1 00:09:30.649 12:03:43 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:09:30.649 12:03:43 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_4.conf 00:09:30.649 12:03:43 -- nvmf/run.sh@29 -- # printf %02d 4 00:09:30.649 12:03:43 -- nvmf/run.sh@29 -- # port=4404 00:09:30.649 12:03:43 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:09:30.649 12:03:43 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' 00:09:30.649 12:03:43 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4404"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:30.649 12:03:43 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' -c /tmp/fuzz_json_4.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 -Z 4 -r /var/tmp/spdk4.sock 00:09:30.649 [2024-06-11 12:03:43.527841] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 
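For reference, the start_llvm_fuzz trace above (nvmf/run.sh@23 through run.sh@36) reduces to the short sketch below. The paths, flags, and the 4420 template port are copied from the trace itself; deriving the port as "44" plus the zero-padded fuzzer index, and redirecting sed's output into the per-run config file, are assumptions of this sketch, not lines shown in the log.
#!/usr/bin/env bash
# Sketch of one fuzzer launch, reconstructed from the run.sh trace above; not the actual script.
rootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk      # workspace path as logged
fuzzer_type=4                                                    # -Z selects which admin-command fuzzer runs
port=44$(printf %02d "$fuzzer_type")                             # 4404, matching run.sh@29 (assumed derivation)
nvmf_cfg=/tmp/fuzz_json_${fuzzer_type}.conf
corpus_dir=$rootdir/../corpus/llvm_nvmf_${fuzzer_type}
trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
mkdir -p "$corpus_dir"                                           # run.sh@30, as logged
# run.sh@33: rewrite the TCP listen port in the JSON config template (output redirect assumed)
sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
    "$rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
# run.sh@36: one core (-m 0x1), 512 MB of memory (-s 512); remaining flags copied verbatim from the trace
"$rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m 0x1 -s 512 \
    -P "$rootdir/../output/llvm/" -F "$trid" -c "$nvmf_cfg" -t 1 \
    -D "$corpus_dir" -Z "$fuzzer_type" -r "/var/tmp/spdk${fuzzer_type}.sock"
rm -rf "$nvmf_cfg"                                               # run.sh@46 cleanup, as logged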
00:09:30.649 [2024-06-11 12:03:43.527910] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2692454 ] 00:09:30.649 EAL: No free 2048 kB hugepages reported on node 1 00:09:30.907 [2024-06-11 12:03:43.776581] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:30.907 [2024-06-11 12:03:43.803066] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:30.907 [2024-06-11 12:03:43.803240] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:30.907 [2024-06-11 12:03:43.857802] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:30.907 [2024-06-11 12:03:43.874045] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4404 *** 00:09:30.907 INFO: Running with entropic power schedule (0xFF, 100). 00:09:30.907 INFO: Seed: 4253044352 00:09:30.907 INFO: Loaded 1 modules (341565 inline 8-bit counters): 341565 [0x26b198c, 0x2704fc9), 00:09:30.907 INFO: Loaded 1 PC tables (341565 PCs): 341565 [0x2704fd0,0x2c3b3a0), 00:09:30.907 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:09:30.908 INFO: A corpus is not provided, starting from an empty corpus 00:09:30.908 #2 INITED exec/s: 0 rss: 61Mb 00:09:30.908 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:30.908 This may also happen if the target rejected all inputs we tried so far 00:09:31.166 [2024-06-11 12:03:43.945196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:d9d90ad9 cdw11:d9d90003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:31.166 [2024-06-11 12:03:43.945250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:31.424 NEW_FUNC[1/664]: 0x4a47e0 in fuzz_admin_create_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:126 00:09:31.425 NEW_FUNC[2/664]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:31.425 #13 NEW cov: 11510 ft: 11511 corp: 2/12b lim: 35 exec/s: 0 rss: 68Mb L: 11/11 MS: 1 InsertRepeatedBytes- 00:09:31.425 [2024-06-11 12:03:44.425964] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:d90a0ad9 cdw11:d9d90003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:31.425 [2024-06-11 12:03:44.426015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:31.684 #14 NEW cov: 11623 ft: 12079 corp: 3/24b lim: 35 exec/s: 0 rss: 69Mb L: 12/12 MS: 1 CrossOver- 00:09:31.684 [2024-06-11 12:03:44.496973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:d9d90ad9 cdw11:6b6b0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:31.684 [2024-06-11 12:03:44.497012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:31.684 [2024-06-11 12:03:44.497112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:6b6b6b6b cdw11:6b6b0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:31.684 [2024-06-11 
12:03:44.497134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:31.684 [2024-06-11 12:03:44.497232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:d9d96bd9 cdw11:d9d90003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:31.684 [2024-06-11 12:03:44.497254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:31.684 #20 NEW cov: 11629 ft: 13020 corp: 4/46b lim: 35 exec/s: 0 rss: 69Mb L: 22/22 MS: 1 InsertRepeatedBytes- 00:09:31.684 [2024-06-11 12:03:44.566470] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:d9d90a0a cdw11:d9d90003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:31.684 [2024-06-11 12:03:44.566508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:31.684 #21 NEW cov: 11714 ft: 13216 corp: 5/57b lim: 35 exec/s: 0 rss: 69Mb L: 11/22 MS: 1 CrossOver- 00:09:31.684 [2024-06-11 12:03:44.628027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:d9d90ad9 cdw11:6b6b0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:31.684 [2024-06-11 12:03:44.628064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:31.684 [2024-06-11 12:03:44.628163] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:6b6b6b6b cdw11:6b6b0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:31.684 [2024-06-11 12:03:44.628186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:31.684 [2024-06-11 12:03:44.628277] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:d9d96bd9 cdw11:d9d90003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:31.684 [2024-06-11 12:03:44.628298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:31.684 [2024-06-11 12:03:44.628399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:6b6b6b6b cdw11:6b6b0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:31.684 [2024-06-11 12:03:44.628426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:31.684 #27 NEW cov: 11714 ft: 13656 corp: 6/89b lim: 35 exec/s: 0 rss: 69Mb L: 32/32 MS: 1 CrossOver- 00:09:31.684 [2024-06-11 12:03:44.698142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:d9d90ad9 cdw11:316b0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:31.684 [2024-06-11 12:03:44.698179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:31.684 [2024-06-11 12:03:44.698274] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:6b6b6b6b cdw11:6b6b0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:31.684 [2024-06-11 12:03:44.698298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:31.684 [2024-06-11 12:03:44.698399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) 
qid:0 cid:6 nsid:0 cdw10:d9d96bd9 cdw11:d9d90003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:31.684 [2024-06-11 12:03:44.698420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:31.943 #28 NEW cov: 11714 ft: 13729 corp: 7/111b lim: 35 exec/s: 0 rss: 69Mb L: 22/32 MS: 1 ChangeByte- 00:09:31.943 [2024-06-11 12:03:44.758278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:d9d90ad9 cdw11:d9d90003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:31.943 [2024-06-11 12:03:44.758313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:31.943 [2024-06-11 12:03:44.758414] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:d9d9d9d9 cdw11:d9d90003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:31.943 [2024-06-11 12:03:44.758437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:31.943 #29 NEW cov: 11714 ft: 13978 corp: 8/127b lim: 35 exec/s: 0 rss: 69Mb L: 16/32 MS: 1 CrossOver- 00:09:31.943 [2024-06-11 12:03:44.818175] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:d9d90a0a cdw11:d9d90003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:31.943 [2024-06-11 12:03:44.818211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:31.943 NEW_FUNC[1/1]: 0x19807e0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:31.943 #30 NEW cov: 11737 ft: 14076 corp: 9/134b lim: 35 exec/s: 0 rss: 69Mb L: 7/32 MS: 1 EraseBytes- 00:09:31.943 [2024-06-11 12:03:44.888818] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:d9d90ad9 cdw11:d9000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:31.943 [2024-06-11 12:03:44.888853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:31.943 [2024-06-11 12:03:44.888948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:d9d900f2 cdw11:d9d90003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:31.943 [2024-06-11 12:03:44.888970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:31.943 #36 NEW cov: 11737 ft: 14095 corp: 10/149b lim: 35 exec/s: 36 rss: 69Mb L: 15/32 MS: 1 CMP- DE: "\000\000\000\362"- 00:09:31.943 [2024-06-11 12:03:44.948910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:d9d90ad9 cdw11:d9d90003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:31.944 [2024-06-11 12:03:44.948947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:32.202 #37 NEW cov: 11737 ft: 14126 corp: 11/157b lim: 35 exec/s: 37 rss: 69Mb L: 8/32 MS: 1 EraseBytes- 00:09:32.202 [2024-06-11 12:03:45.020140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:d9d90ad9 cdw11:6b6b0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:32.202 [2024-06-11 12:03:45.020176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:32.202 
[2024-06-11 12:03:45.020276] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:6b6b6b6b cdw11:6b6b0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:32.202 [2024-06-11 12:03:45.020297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:32.202 [2024-06-11 12:03:45.020386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:d9d96bd9 cdw11:d9d90003 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:32.202 [2024-06-11 12:03:45.020411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:32.202 #38 NEW cov: 11737 ft: 14130 corp: 12/179b lim: 35 exec/s: 38 rss: 69Mb L: 22/32 MS: 1 ShuffleBytes-
00:09:32.202 [2024-06-11 12:03:45.079569] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:d9d90ad9 cdw11:d9ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:32.202 [2024-06-11 12:03:45.079605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:32.202 #39 NEW cov: 11737 ft: 14174 corp: 13/191b lim: 35 exec/s: 39 rss: 69Mb L: 12/32 MS: 1 CMP- DE: "\377\377\377\377"-
00:09:32.202 [2024-06-11 12:03:45.149870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:d9d90a0a cdw11:d9d90003 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:32.202 [2024-06-11 12:03:45.149907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:32.202 #40 NEW cov: 11737 ft: 14213 corp: 14/198b lim: 35 exec/s: 40 rss: 70Mb L: 7/32 MS: 1 ShuffleBytes-
00:09:32.202 [2024-06-11 12:03:45.220391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:d9f80ad9 cdw11:d9000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:32.202 [2024-06-11 12:03:45.220427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:32.202 [2024-06-11 12:03:45.220520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:d9d900f2 cdw11:d9d90003 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:32.202 [2024-06-11 12:03:45.220543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:32.461 #41 NEW cov: 11737 ft: 14232 corp: 15/213b lim: 35 exec/s: 41 rss: 70Mb L: 15/32 MS: 1 ChangeByte-
00:09:32.461 [2024-06-11 12:03:45.281194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:d9d90ad9 cdw11:6b6b0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:32.461 [2024-06-11 12:03:45.281231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:32.461 [2024-06-11 12:03:45.281330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:6b6b6b6b cdw11:eb6b0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:32.461 [2024-06-11 12:03:45.281353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:32.461 [2024-06-11 12:03:45.281459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:d9d96bd9 cdw11:d9d90003 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:32.461 [2024-06-11 12:03:45.281480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:32.461 #42 NEW cov: 11737 ft: 14261 corp: 16/235b lim: 35 exec/s: 42 rss: 70Mb L: 22/32 MS: 1 ChangeBit-
00:09:32.461 [2024-06-11 12:03:45.352205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:d9250ad9 cdw11:d96b0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:32.461 [2024-06-11 12:03:45.352240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:32.461 [2024-06-11 12:03:45.352336] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:6b6b6b6b cdw11:6b6b0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:32.461 [2024-06-11 12:03:45.352365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:32.461 [2024-06-11 12:03:45.352458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:d9d96b6b cdw11:d9d90003 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:32.461 [2024-06-11 12:03:45.352478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:32.461 [2024-06-11 12:03:45.352569] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:6b6bd96b cdw11:6b6b0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:32.461 [2024-06-11 12:03:45.352590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:09:32.461 #43 NEW cov: 11737 ft: 14284 corp: 17/268b lim: 35 exec/s: 43 rss: 70Mb L: 33/33 MS: 1 InsertByte-
00:09:32.461 [2024-06-11 12:03:45.421620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:d9d90a0a cdw11:d9d90003 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:32.461 [2024-06-11 12:03:45.421657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:32.461 #44 NEW cov: 11737 ft: 14324 corp: 18/275b lim: 35 exec/s: 44 rss: 70Mb L: 7/33 MS: 1 ShuffleBytes-
00:09:32.461 [2024-06-11 12:03:45.492073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:d90a0a0a cdw11:d9d90003 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:32.461 [2024-06-11 12:03:45.492108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:32.720 #45 NEW cov: 11737 ft: 14403 corp: 19/282b lim: 35 exec/s: 45 rss: 70Mb L: 7/33 MS: 1 CopyPart-
00:09:32.720 [2024-06-11 12:03:45.562612] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:21d90ad9 cdw11:d9d90003 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:32.720 [2024-06-11 12:03:45.562647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:32.720 #46 NEW cov: 11737 ft: 14419 corp: 20/295b lim: 35 exec/s: 46 rss: 70Mb L: 13/33 MS: 1 InsertByte-
00:09:32.720 [2024-06-11 12:03:45.632999] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0ad90a0a cdw11:d9d90003 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:32.720 [2024-06-11 12:03:45.633035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:32.720 #47 NEW cov: 11737 ft: 14489 corp: 21/302b lim: 35 exec/s: 47 rss: 70Mb L: 7/33 MS: 1 ShuffleBytes-
00:09:32.720 [2024-06-11 12:03:45.704293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:d9d90ad9 cdw11:316b0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:32.720 [2024-06-11 12:03:45.704328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:32.720 [2024-06-11 12:03:45.704430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:6b6b6b6b cdw11:6b6b0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:32.720 [2024-06-11 12:03:45.704453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:32.720 [2024-06-11 12:03:45.704541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:d90a6bd9 cdw11:d9d90003 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:32.720 [2024-06-11 12:03:45.704562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:32.720 #48 NEW cov: 11737 ft: 14492 corp: 22/324b lim: 35 exec/s: 48 rss: 70Mb L: 22/33 MS: 1 CopyPart-
00:09:32.979 [2024-06-11 12:03:45.775205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:d9d90ad9 cdw11:316b0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:32.979 [2024-06-11 12:03:45.775240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:32.979 [2024-06-11 12:03:45.775330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:6b6b6b6b cdw11:6b890001 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:32.979 [2024-06-11 12:03:45.775352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:32.979 [2024-06-11 12:03:45.775456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:89898989 cdw11:6b6b0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:32.979 [2024-06-11 12:03:45.775477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:32.979 [2024-06-11 12:03:45.775569] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:d9d9d9d9 cdw11:d9d90003 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:32.979 [2024-06-11 12:03:45.775591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:09:32.979 #49 NEW cov: 11737 ft: 14497 corp: 23/352b lim: 35 exec/s: 49 rss: 70Mb L: 28/33 MS: 1 InsertRepeatedBytes-
00:09:32.979 [2024-06-11 12:03:45.834292] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:d9d90ad9 cdw11:d9ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:32.979 [2024-06-11 12:03:45.834327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:32.979 #50 NEW cov: 11737 ft: 14565 corp: 24/364b lim: 35 exec/s: 50 rss: 70Mb L: 12/33 MS: 1 ChangeBinInt-
00:09:32.979 [2024-06-11 12:03:45.895160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:d9d90a0a cdw11:d9d90003 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:32.979 [2024-06-11 12:03:45.895195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:32.979 [2024-06-11 12:03:45.895293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:74747474 cdw11:74740002 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:32.979 [2024-06-11 12:03:45.895315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:32.979 #51 NEW cov: 11737 ft: 14571 corp: 25/382b lim: 35 exec/s: 25 rss: 70Mb L: 18/33 MS: 1 InsertRepeatedBytes-
00:09:32.979 #51 DONE cov: 11737 ft: 14571 corp: 25/382b lim: 35 exec/s: 25 rss: 70Mb
00:09:32.979 ###### Recommended dictionary. ######
00:09:32.979 "\000\000\000\362" # Uses: 0
00:09:32.979 "\377\377\377\377" # Uses: 0
00:09:32.979 ###### End of recommended dictionary. ######
00:09:32.979 Done 51 runs in 2 second(s)
00:09:33.238 12:03:46 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_4.conf
00:09:33.238 12:03:46 -- ../common.sh@72 -- # (( i++ ))
00:09:33.238 12:03:46 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:09:33.238 12:03:46 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1
00:09:33.238 12:03:46 -- nvmf/run.sh@23 -- # local fuzzer_type=5
00:09:33.238 12:03:46 -- nvmf/run.sh@24 -- # local timen=1
00:09:33.238 12:03:46 -- nvmf/run.sh@25 -- # local core=0x1
00:09:33.238 12:03:46 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5
00:09:33.238 12:03:46 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_5.conf
00:09:33.238 12:03:46 -- nvmf/run.sh@29 -- # printf %02d 5
00:09:33.238 12:03:46 -- nvmf/run.sh@29 -- # port=4405
00:09:33.238 12:03:46 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5
00:09:33.238 12:03:46 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405'
00:09:33.238 12:03:46 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4405"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:09:33.238 12:03:46 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' -c /tmp/fuzz_json_5.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 -Z 5 -r /var/tmp/spdk5.sock
00:09:33.239 [2024-06-11 12:03:46.076333] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization...
00:09:33.239 [2024-06-11 12:03:46.076394] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2692823 ]
00:09:33.496 EAL: No free 2048 kB hugepages reported on node 1
00:09:33.496 [2024-06-11 12:03:46.286366] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:09:33.496 [2024-06-11 12:03:46.312909] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:09:33.496 [2024-06-11 12:03:46.313081] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:09:33.496 [2024-06-11 12:03:46.367554] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:09:33.496 [2024-06-11 12:03:46.383795] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4405 ***
00:09:33.496 INFO: Running with entropic power schedule (0xFF, 100).
00:09:33.496 INFO: Seed: 2468069265
00:09:33.496 INFO: Loaded 1 modules (341565 inline 8-bit counters): 341565 [0x26b198c, 0x2704fc9),
00:09:33.496 INFO: Loaded 1 PC tables (341565 PCs): 341565 [0x2704fd0,0x2c3b3a0),
00:09:33.496 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5
00:09:33.496 INFO: A corpus is not provided, starting from an empty corpus
00:09:33.496 #2 INITED exec/s: 0 rss: 61Mb
00:09:33.496 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:09:33.496 This may also happen if the target rejected all inputs we tried so far
00:09:33.496 [2024-06-11 12:03:46.433592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d4d40ad4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:33.496 [2024-06-11 12:03:46.433631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:33.496 [2024-06-11 12:03:46.433697] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:d4d4d4d4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:33.496 [2024-06-11 12:03:46.433717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:33.496 [2024-06-11 12:03:46.433779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:d4d4d4d4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:33.496 [2024-06-11 12:03:46.433798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:33.496 [2024-06-11 12:03:46.433861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:d4d4d4d4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:33.496 [2024-06-11 12:03:46.433880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:09:34.064 NEW_FUNC[1/664]: 0x4a6970 in fuzz_admin_create_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:142
00:09:34.064 NEW_FUNC[2/664]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:09:34.064 #3 NEW cov: 11521 ft: 11522 corp: 2/43b lim: 45 exec/s: 0 rss: 68Mb L: 42/42 MS: 1 InsertRepeatedBytes-
00:09:34.065 [2024-06-11 12:03:46.904152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000100 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.065 [2024-06-11 12:03:46.904197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:34.065 #13 NEW cov: 11634 ft: 12873 corp: 3/52b lim: 45 exec/s: 0 rss: 68Mb L: 9/42 MS: 5 CopyPart-ShuffleBytes-ChangeByte-CopyPart-CMP- DE: "\001\000\000\000\000\000\000\000"-
00:09:34.065 [2024-06-11 12:03:46.964234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000a01 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.065 [2024-06-11 12:03:46.964270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:34.065 #15 NEW cov: 11640 ft: 13067 corp: 4/63b lim: 45 exec/s: 0 rss: 68Mb L: 11/42 MS: 2 CopyPart-CrossOver-
00:09:34.065 [2024-06-11 12:03:47.014853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d4d40ad4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.065 [2024-06-11 12:03:47.014891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:34.065 [2024-06-11 12:03:47.014956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:41d4d4d4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.065 [2024-06-11 12:03:47.014975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:34.065 [2024-06-11 12:03:47.015038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:d4d4d4d4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.065 [2024-06-11 12:03:47.015057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:34.065 [2024-06-11 12:03:47.015119] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:d4d4d4d4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.065 [2024-06-11 12:03:47.015137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:09:34.065 #16 NEW cov: 11725 ft: 13248 corp: 5/105b lim: 45 exec/s: 0 rss: 69Mb L: 42/42 MS: 1 ChangeByte-
00:09:34.325 [2024-06-11 12:03:47.074542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00008a01 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.325 [2024-06-11 12:03:47.074577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:34.325 #18 NEW cov: 11725 ft: 13366 corp: 6/114b lim: 45 exec/s: 0 rss: 69Mb L: 9/42 MS: 2 ChangeBit-PersAutoDict- DE: "\001\000\000\000\000\000\000\000"-
00:09:34.325 [2024-06-11 12:03:47.125172] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d4d40ad4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.325 [2024-06-11 12:03:47.125206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:34.325 [2024-06-11 12:03:47.125272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:d4d401d4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.325 [2024-06-11 12:03:47.125292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:34.325 [2024-06-11 12:03:47.125357] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:d4d4d4d4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.325 [2024-06-11 12:03:47.125383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:34.325 [2024-06-11 12:03:47.125446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:d400d4d4 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.325 [2024-06-11 12:03:47.125464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:09:34.325 #19 NEW cov: 11725 ft: 13421 corp: 7/150b lim: 45 exec/s: 0 rss: 69Mb L: 36/42 MS: 1 CrossOver-
00:09:34.325 [2024-06-11 12:03:47.175368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d4d40ad4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.325 [2024-06-11 12:03:47.175405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:34.325 [2024-06-11 12:03:47.175470] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:d4d401d4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.325 [2024-06-11 12:03:47.175489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:34.325 [2024-06-11 12:03:47.175552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:d4d4d4d4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.325 [2024-06-11 12:03:47.175575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:34.325 [2024-06-11 12:03:47.175637] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:d400d4d4 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.325 [2024-06-11 12:03:47.175656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:09:34.325 #20 NEW cov: 11725 ft: 13466 corp: 8/186b lim: 45 exec/s: 0 rss: 69Mb L: 36/42 MS: 1 ChangeBinInt-
00:09:34.325 [2024-06-11 12:03:47.235506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d4d40ad4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.325 [2024-06-11 12:03:47.235540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:34.325 [2024-06-11 12:03:47.235606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:010affff cdw11:adee0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.325 [2024-06-11 12:03:47.235626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:34.325 [2024-06-11 12:03:47.235690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:d4d4d4d4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.325 [2024-06-11 12:03:47.235708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:34.325 [2024-06-11 12:03:47.235772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:d4d4d4d4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.325 [2024-06-11 12:03:47.235790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:09:34.325 #21 NEW cov: 11725 ft: 13522 corp: 9/230b lim: 45 exec/s: 0 rss: 69Mb L: 44/44 MS: 1 CMP- DE: "\377\377\377\377\001\012\255\356"-
00:09:34.325 [2024-06-11 12:03:47.285620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d4d40ad4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.325 [2024-06-11 12:03:47.285653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:34.325 [2024-06-11 12:03:47.285718] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:d4d4d4d4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.325 [2024-06-11 12:03:47.285737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:34.325 [2024-06-11 12:03:47.285801] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:d4d4d4d4 cdw11:01000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.325 [2024-06-11 12:03:47.285819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:34.325 [2024-06-11 12:03:47.285880] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:00d40000 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.325 [2024-06-11 12:03:47.285899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:09:34.325 #22 NEW cov: 11725 ft: 13537 corp: 10/272b lim: 45 exec/s: 0 rss: 69Mb L: 42/44 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"-
00:09:34.325 [2024-06-11 12:03:47.335802] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d4d40ad4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.325 [2024-06-11 12:03:47.335835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:34.325 [2024-06-11 12:03:47.335905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:d4d4d4d4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.325 [2024-06-11 12:03:47.335925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:34.325 [2024-06-11 12:03:47.335987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:d4d4d4d4 cdw11:01000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.325 [2024-06-11 12:03:47.336005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:34.325 [2024-06-11 12:03:47.336067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:00d40000 cdw11:d4d40000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.325 [2024-06-11 12:03:47.336085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:09:34.584 NEW_FUNC[1/1]: 0x19807e0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
00:09:34.584 #23 NEW cov: 11748 ft: 13582 corp: 11/314b lim: 45 exec/s: 0 rss: 69Mb L: 42/44 MS: 1 CopyPart-
00:09:34.584 [2024-06-11 12:03:47.395913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0ad4 cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.584 [2024-06-11 12:03:47.395946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:34.584 [2024-06-11 12:03:47.396012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:d401eed4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.584 [2024-06-11 12:03:47.396031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:34.584 [2024-06-11 12:03:47.396093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:d4d4d4d4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.584 [2024-06-11 12:03:47.396112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:34.584 [2024-06-11 12:03:47.396175] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:d4d4d4d4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.584 [2024-06-11 12:03:47.396194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:09:34.584 #24 NEW cov: 11748 ft: 13592 corp: 12/358b lim: 45 exec/s: 24 rss: 69Mb L: 44/44 MS: 1 CopyPart-
00:09:34.584 [2024-06-11 12:03:47.456111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:20202020 cdw11:20200001 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.584 [2024-06-11 12:03:47.456145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:34.584 [2024-06-11 12:03:47.456208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:d4d4d4d4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.584 [2024-06-11 12:03:47.456228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:34.584 [2024-06-11 12:03:47.456292] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:d4d4d4d4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.584 [2024-06-11 12:03:47.456311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:34.584 [2024-06-11 12:03:47.456374] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:d4d4d4d4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.584 [2024-06-11 12:03:47.456393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:09:34.584 #25 NEW cov: 11748 ft: 13783 corp: 13/402b lim: 45 exec/s: 25 rss: 69Mb L: 44/44 MS: 1 InsertRepeatedBytes-
00:09:34.584 [2024-06-11 12:03:47.506281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d4d40ad4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.584 [2024-06-11 12:03:47.506315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:34.584 [2024-06-11 12:03:47.506387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:41d4d4d4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.584 [2024-06-11 12:03:47.506406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:34.584 [2024-06-11 12:03:47.506469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:d4d4d4d4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.584 [2024-06-11 12:03:47.506488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:34.584 [2024-06-11 12:03:47.506551] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:d4d4d4d4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.584 [2024-06-11 12:03:47.506569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:09:34.584 #26 NEW cov: 11748 ft: 13803 corp: 14/445b lim: 45 exec/s: 26 rss: 69Mb L: 43/44 MS: 1 InsertByte-
00:09:34.584 [2024-06-11 12:03:47.565883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0a0a cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.584 [2024-06-11 12:03:47.565917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:34.585 #29 NEW cov: 11748 ft: 13816 corp: 15/456b lim: 45 exec/s: 29 rss: 69Mb L: 11/44 MS: 3 CrossOver-CrossOver-PersAutoDict- DE: "\377\377\377\377\001\012\255\356"-
00:09:34.585 [2024-06-11 12:03:47.616072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000100 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.585 [2024-06-11 12:03:47.616105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:34.744 #30 NEW cov: 11748 ft: 13826 corp: 16/465b lim: 45 exec/s: 30 rss: 69Mb L: 9/44 MS: 1 ShuffleBytes-
00:09:34.744 [2024-06-11 12:03:47.676223] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000100 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.744 [2024-06-11 12:03:47.676256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:34.744 #31 NEW cov: 11748 ft: 13844 corp: 17/474b lim: 45 exec/s: 31 rss: 70Mb L: 9/44 MS: 1 CopyPart-
00:09:34.744 [2024-06-11 12:03:47.736961] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d4d40ad4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.844 [2024-06-11 12:03:47.736995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:34.844 [2024-06-11 12:03:47.737059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:41d4d4d4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.844 [2024-06-11 12:03:47.737078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:34.844 [2024-06-11 12:03:47.737143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:d4d4d4d4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.844 [2024-06-11 12:03:47.737161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:34.844 [2024-06-11 12:03:47.737225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:d4d4d4d4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.844 [2024-06-11 12:03:47.737247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:09:34.844 #32 NEW cov: 11748 ft: 13889 corp: 18/517b lim: 45 exec/s: 32 rss: 70Mb L: 43/44 MS: 1 ChangeBinInt-
00:09:34.844 [2024-06-11 12:03:47.797119] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d4d40ad4 cdw11:01000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.844 [2024-06-11 12:03:47.797155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:34.844 [2024-06-11 12:03:47.797221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00d40000 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.844 [2024-06-11 12:03:47.797241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:34.844 [2024-06-11 12:03:47.797306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:d4d4d4d4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.844 [2024-06-11 12:03:47.797324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:34.844 [2024-06-11 12:03:47.797394] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:d400d4d4 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.844 [2024-06-11 12:03:47.797414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:09:34.844 #33 NEW cov: 11748 ft: 13902 corp: 19/553b lim: 45 exec/s: 33 rss: 70Mb L: 36/44 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"-
00:09:34.844 [2024-06-11 12:03:47.857455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0ad4 cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.844 [2024-06-11 12:03:47.857490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:34.844 [2024-06-11 12:03:47.857556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:d401eed4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.844 [2024-06-11 12:03:47.857575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:34.844 [2024-06-11 12:03:47.857637] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:d4d4d4d4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.844 [2024-06-11 12:03:47.857656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:34.844 [2024-06-11 12:03:47.857718] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:d4d4d4d4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.844 [2024-06-11 12:03:47.857736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:09:34.844 [2024-06-11 12:03:47.857797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:d400d4d4 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:34.844 [2024-06-11 12:03:47.857815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0
00:09:35.102 #34 NEW cov: 11748 ft: 13978 corp: 20/598b lim: 45 exec/s: 34 rss: 70Mb L: 45/45 MS: 1 CopyPart-
00:09:35.102 [2024-06-11 12:03:47.917442] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d4d40ad4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:35.102 [2024-06-11 12:03:47.917476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:35.102 [2024-06-11 12:03:47.917546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:d4d4d4d4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:35.102 [2024-06-11 12:03:47.917565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:35.102 [2024-06-11 12:03:47.917628] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:d400d4d4 cdw11:2bd40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:35.103 [2024-06-11 12:03:47.917647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:35.103 [2024-06-11 12:03:47.917712] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:d4d4d4d4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:35.103 [2024-06-11 12:03:47.917731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:09:35.103 #35 NEW cov: 11748 ft: 14023 corp: 21/637b lim: 45 exec/s: 35 rss: 70Mb L: 39/45 MS: 1 EraseBytes-
00:09:35.103 [2024-06-11 12:03:47.977113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000100 cdw11:10000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:35.103 [2024-06-11 12:03:47.977148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:35.103 #36 NEW cov: 11748 ft: 14030 corp: 22/646b lim: 45 exec/s: 36 rss: 70Mb L: 9/45 MS: 1 ChangeBit-
00:09:35.103 [2024-06-11 12:03:48.027764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d4d40ad4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:35.103 [2024-06-11 12:03:48.027799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:35.103 [2024-06-11 12:03:48.027864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:41d4d4d4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:35.103 [2024-06-11 12:03:48.027884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:35.103 [2024-06-11 12:03:48.027950] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:d4d4d4d4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:35.103 [2024-06-11 12:03:48.027969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:35.103 [2024-06-11 12:03:48.028032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:32d4d4d4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:35.103 [2024-06-11 12:03:48.028051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:09:35.103 #37 NEW cov: 11748 ft: 14058 corp: 23/689b lim: 45 exec/s: 37 rss: 70Mb L: 43/45 MS: 1 ChangeBinInt-
00:09:35.103 [2024-06-11 12:03:48.077926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d4d40ad4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:35.103 [2024-06-11 12:03:48.077960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:35.103 [2024-06-11 12:03:48.078028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:41d4d4d4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:35.103 [2024-06-11 12:03:48.078047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:35.103 [2024-06-11 12:03:48.078111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:d4d4d4d4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:35.103 [2024-06-11 12:03:48.078130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:35.103 [2024-06-11 12:03:48.078194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:d4322bd4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:35.103 [2024-06-11 12:03:48.078217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:09:35.103 #38 NEW cov: 11748 ft: 14069 corp: 24/733b lim: 45 exec/s: 38 rss: 70Mb L: 44/45 MS: 1 CrossOver-
00:09:35.363 [2024-06-11 12:03:48.138112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d4d40ad4 cdw11:01d40000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:35.363 [2024-06-11 12:03:48.138146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:35.363 [2024-06-11 12:03:48.138212] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:35.363 [2024-06-11 12:03:48.138232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:35.363 [2024-06-11 12:03:48.138295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:d4d40000 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:35.363 [2024-06-11 12:03:48.138313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:35.363 [2024-06-11 12:03:48.138381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:d4d4d4d4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:35.363 [2024-06-11 12:03:48.138400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:09:35.363 #39 NEW cov: 11748 ft: 14150 corp: 25/777b lim: 45 exec/s: 39 rss: 70Mb L: 44/45 MS: 1 CopyPart-
00:09:35.363 [2024-06-11 12:03:48.198281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0ad4 cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:35.363 [2024-06-11 12:03:48.198315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:35.363 [2024-06-11 12:03:48.198383] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:d401eed4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:35.363 [2024-06-11 12:03:48.198404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:35.363 [2024-06-11 12:03:48.198468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:d4d4d4d4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:35.363 [2024-06-11 12:03:48.198486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:35.363 [2024-06-11 12:03:48.198551] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:d4d4d4d4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:35.363 [2024-06-11 12:03:48.198569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:09:35.363 #40 NEW cov: 11748 ft: 14233 corp: 26/818b lim: 45 exec/s: 40 rss: 70Mb L: 41/45 MS: 1 EraseBytes-
00:09:35.363 [2024-06-11 12:03:48.258473] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d4d40ad4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:35.363 [2024-06-11 12:03:48.258507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:35.363 [2024-06-11 12:03:48.258572] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:0041d4d4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:35.363 [2024-06-11 12:03:48.258592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:35.363 [2024-06-11 12:03:48.258656] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:d4d4d4d4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:35.363 [2024-06-11 12:03:48.258678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:35.363 [2024-06-11 12:03:48.258742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:d4d42bd4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:35.363 [2024-06-11 12:03:48.258760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:09:35.363 #41 NEW cov: 11748 ft: 14244 corp: 27/862b lim: 45 exec/s: 41 rss: 70Mb L: 44/45 MS: 1 CrossOver-
00:09:35.363 [2024-06-11 12:03:48.298592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000100 cdw11:10000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:35.363 [2024-06-11 12:03:48.298625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:35.363 [2024-06-11 12:03:48.298690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:f1f1f1f1 cdw11:f1f10007 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:35.363 [2024-06-11 12:03:48.298709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:35.363 [2024-06-11 12:03:48.298773] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:f1f1f1f1 cdw11:f1f10007 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:35.363 [2024-06-11 12:03:48.298792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:35.363 [2024-06-11 12:03:48.298855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:f1f1f1f1 cdw11:f1f10007 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:35.363 [2024-06-11 12:03:48.298874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:09:35.363 #42 NEW cov: 11748 ft: 14268 corp: 28/904b lim: 45 exec/s: 42 rss: 70Mb L: 42/45 MS: 1 InsertRepeatedBytes-
00:09:35.363 [2024-06-11 12:03:48.358754] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d4d40ad4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:35.363 [2024-06-11 12:03:48.358787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:35.363 [2024-06-11 12:03:48.358855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:d4d4d4d4 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:35.363 [2024-06-11 12:03:48.358874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:35.363 [2024-06-11 12:03:48.358939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:d4d43ad4 cdw11:01000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:35.363 [2024-06-11 12:03:48.358958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:35.364 [2024-06-11 12:03:48.359021] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:00d40000 cdw11:d4d40006 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:35.364 [2024-06-11 12:03:48.359040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:09:35.364 #43 NEW cov: 11748 ft: 14322 corp: 29/946b lim: 45 exec/s: 43 rss: 70Mb L: 42/45 MS: 1 ChangeByte-
00:09:35.622 [2024-06-11 12:03:48.408342] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00002100 cdw11:10000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:35.622 [2024-06-11 12:03:48.408380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:35.622 #44 NEW cov: 11748 ft: 14345 corp: 30/955b lim: 45 exec/s: 22 rss: 70Mb L: 9/45 MS: 1 ChangeBit-
00:09:35.622 #44 DONE cov: 11748 ft: 14345 corp: 30/955b lim: 45 exec/s: 22 rss: 70Mb
00:09:35.622 ###### Recommended dictionary. ######
00:09:35.622 "\001\000\000\000\000\000\000\000" # Uses: 3
00:09:35.622 "\377\377\377\377\001\012\255\356" # Uses: 1
00:09:35.622 ###### End of recommended dictionary. ######
00:09:35.622 Done 44 runs in 2 second(s)
00:09:35.622 12:03:48 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_5.conf
00:09:35.622 12:03:48 -- ../common.sh@72 -- # (( i++ ))
00:09:35.622 12:03:48 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:09:35.622 12:03:48 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1
00:09:35.622 12:03:48 -- nvmf/run.sh@23 -- # local fuzzer_type=6
00:09:35.622 12:03:48 -- nvmf/run.sh@24 -- # local timen=1
00:09:35.622 12:03:48 -- nvmf/run.sh@25 -- # local core=0x1
00:09:35.622 12:03:48 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6
00:09:35.622 12:03:48 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_6.conf
00:09:35.622 12:03:48 -- nvmf/run.sh@29 -- # printf %02d 6
00:09:35.622 12:03:48 -- nvmf/run.sh@29 -- # port=4406
00:09:35.622 12:03:48 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6
00:09:35.622 12:03:48 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406'
00:09:35.622 12:03:48 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4406"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:09:35.622 12:03:48 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' -c /tmp/fuzz_json_6.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 -Z 6 -r /var/tmp/spdk6.sock
00:09:35.622 [2024-06-11 12:03:48.621905] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization...
00:09:35.622 [2024-06-11 12:03:48.621995] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2693184 ]
00:09:35.881 EAL: No free 2048 kB hugepages reported on node 1
00:09:35.881 [2024-06-11 12:03:48.859400] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:09:35.881 [2024-06-11 12:03:48.885789] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:09:35.881 [2024-06-11 12:03:48.885966] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:09:36.140 [2024-06-11 12:03:48.940415] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:09:36.140 [2024-06-11 12:03:48.956667] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4406 ***
00:09:36.140 INFO: Running with entropic power schedule (0xFF, 100).
00:09:36.140 INFO: Seed: 748102280
00:09:36.140 INFO: Loaded 1 modules (341565 inline 8-bit counters): 341565 [0x26b198c, 0x2704fc9),
00:09:36.140 INFO: Loaded 1 PC tables (341565 PCs): 341565 [0x2704fd0,0x2c3b3a0),
00:09:36.140 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6
00:09:36.140 INFO: A corpus is not provided, starting from an empty corpus
00:09:36.140 #2 INITED exec/s: 0 rss: 61Mb
00:09:36.140 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:09:36.140 This may also happen if the target rejected all inputs we tried so far
00:09:36.140 [2024-06-11 12:03:49.011790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000
00:09:36.140 [2024-06-11 12:03:49.011839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:36.707 NEW_FUNC[1/662]: 0x4a9180 in fuzz_admin_delete_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:161
00:09:36.707 NEW_FUNC[2/662]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:09:36.707 #3 NEW cov: 11438 ft: 11436 corp: 2/3b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 CopyPart-
00:09:36.707 [2024-06-11 12:03:49.513056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000f50c cdw11:00000000
00:09:36.707 [2024-06-11 12:03:49.513113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:36.707 #8 NEW cov: 11551 ft: 11853 corp: 3/5b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 5 ChangeBit-ChangeBinInt-ChangeBit-CrossOver-InsertByte-
00:09:36.707 [2024-06-11 12:03:49.583086] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000505 cdw11:00000000
00:09:36.707 [2024-06-11 12:03:49.583130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:36.707 #10 NEW cov: 11557 ft: 12179 corp: 4/7b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 2 ChangeBinInt-CopyPart-
00:09:36.707 [2024-06-11 12:03:49.653288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000f50c cdw11:00000000
00:09:36.707 [2024-06-11 12:03:49.653330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:36.707 #11 NEW cov: 11642 ft: 12421 corp: 5/9b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 CopyPart-
00:09:36.966 [2024-06-11 12:03:49.743828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000f5ff cdw11:00000000
00:09:36.966 [2024-06-11 12:03:49.743869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:36.966 [2024-06-11 12:03:49.743916] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000
00:09:36.966 [2024-06-11 12:03:49.743939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:36.966 [2024-06-11 12:03:49.743981] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000
00:09:36.966 [2024-06-11 12:03:49.744003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:36.966 [2024-06-11 12:03:49.744044] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000
00:09:36.966 [2024-06-11 12:03:49.744066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:09:36.966 [2024-06-11 12:03:49.744107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000ff0c cdw11:00000000
00:09:36.966 [2024-06-11 12:03:49.744130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0
00:09:36.966 #12 NEW cov: 11642 ft: 12838 corp: 6/19b lim: 10 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 InsertRepeatedBytes-
00:09:36.966 [2024-06-11 12:03:49.823889] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000f5ff cdw11:00000000
00:09:36.966 [2024-06-11 12:03:49.823931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:36.966 [2024-06-11 12:03:49.823977] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000
00:09:36.966 [2024-06-11 12:03:49.824001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:36.966 [2024-06-11 12:03:49.824042] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000
00:09:36.966 [2024-06-11 12:03:49.824066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:36.966 NEW_FUNC[1/1]: 0x19807e0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
00:09:36.966 #13 NEW cov: 11665 ft: 13053 corp: 7/26b lim: 10 exec/s: 0 rss: 69Mb L: 7/10 MS: 1 EraseBytes-
00:09:36.966 [2024-06-11 12:03:49.914238] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000
00:09:36.966 [2024-06-11 12:03:49.914280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:36.966 [2024-06-11 12:03:49.914326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000
00:09:36.966 [2024-06-11 12:03:49.914349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:36.966 [2024-06-11 12:03:49.914404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000
00:09:36.966 [2024-06-11 12:03:49.914427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:36.966 [2024-06-11 12:03:49.914468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000
00:09:36.966 [2024-06-11 12:03:49.914491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:09:36.966 [2024-06-11 12:03:49.914532] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000f50c cdw11:00000000
00:09:36.966 [2024-06-11 12:03:49.914554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0
00:09:36.966 #14 NEW cov: 11665 ft: 13126 corp: 8/36b lim: 10 exec/s: 0 rss: 69Mb L: 10/10 MS: 1 InsertRepeatedBytes-
00:09:36.966 [2024-06-11 12:03:49.984796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000f50c cdw11:00000000
00:09:36.966 [2024-06-11 12:03:49.984831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:37.225 #15 NEW cov: 11665 ft: 13322 corp: 9/38b lim: 10 exec/s: 15 rss: 69Mb L: 2/10 MS: 1 ShuffleBytes-
00:09:37.225 [2024-06-11 12:03:50.035518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000f5ff cdw11:00000000
00:09:37.225 [2024-06-11 12:03:50.035555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:37.225 [2024-06-11 12:03:50.035624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000
00:09:37.225 [2024-06-11 12:03:50.035645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:37.225 [2024-06-11 12:03:50.035711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000
00:09:37.225 [2024-06-11 12:03:50.035730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:37.225 [2024-06-11 12:03:50.035795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ff27 cdw11:00000000
00:09:37.225 [2024-06-11 12:03:50.035814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:09:37.225 [2024-06-11 12:03:50.035879] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000ff0c cdw11:00000000
00:09:37.225 [2024-06-11 12:03:50.035899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0
00:09:37.225 #16 NEW cov: 11665 ft: 13359 corp: 10/48b lim: 10 exec/s: 16 rss: 69Mb L: 10/10 MS: 1 ChangeByte-
00:09:37.225 [2024-06-11 12:03:50.085094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000
00:09:37.225 [2024-06-11 12:03:50.085131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:37.225 #17 NEW cov: 11665 ft: 13455 corp: 11/50b lim: 10 exec/s: 17 rss: 69Mb L: 2/10 MS: 1 CMP- DE: "\377\377"-
00:09:37.225 [2024-06-11 12:03:50.125340] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000
00:09:37.225 [2024-06-11 12:03:50.125381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:37.225 [2024-06-11 12:03:50.125447] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000
00:09:37.225 [2024-06-11 12:03:50.125466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:37.225 #18 NEW cov: 11665 ft: 13643 corp: 12/54b lim: 10 exec/s: 18 rss: 69Mb L: 4/10 MS: 1 CopyPart-
00:09:37.225 [2024-06-11 12:03:50.185506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000
00:09:37.225 [2024-06-11 12:03:50.185541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:37.225 [2024-06-11 12:03:50.185609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ff0a cdw11:00000000
00:09:37.225 [2024-06-11 12:03:50.185628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:37.225 #19 NEW cov: 11665 ft: 13660 corp: 13/58b lim: 10 exec/s: 19 rss: 69Mb L: 4/10 MS: 1 PersAutoDict- DE: "\377\377"-
00:09:37.225 [2024-06-11 12:03:50.235490] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000f50c cdw11:00000000
00:09:37.225 [2024-06-11 12:03:50.235524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:37.484 #20 NEW cov: 11665 ft: 13742 corp: 14/60b lim: 10 exec/s: 20 rss: 69Mb L: 2/10 MS: 1 ShuffleBytes-
00:09:37.484 [2024-06-11 12:03:50.296176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000
00:09:37.484 [2024-06-11 12:03:50.296210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:37.484 [2024-06-11 12:03:50.296279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000
00:09:37.484 [2024-06-11 12:03:50.296298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:37.484 [2024-06-11 12:03:50.296366] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000
00:09:37.484 [2024-06-11 12:03:50.296386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:37.484 [2024-06-11 12:03:50.296455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00002000 cdw11:00000000
00:09:37.484 [2024-06-11 12:03:50.296474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:09:37.484 [2024-06-11 12:03:50.296540] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000f50c cdw11:00000000
00:09:37.484 [2024-06-11 12:03:50.296559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0
00:09:37.484 #21 NEW cov: 11665 ft: 13754 corp: 15/70b lim: 10 exec/s: 21 rss: 69Mb L: 10/10 MS: 1 ChangeBit-
00:09:37.484 [2024-06-11 12:03:50.355771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000
00:09:37.484 [2024-06-11 12:03:50.355807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:37.484 #23 NEW cov: 11665 ft: 13787 corp: 16/73b lim: 10 exec/s: 23 rss: 69Mb L: 3/10 MS: 2 ShuffleBytes-PersAutoDict- DE: "\377\377"-
00:09:37.484 [2024-06-11 12:03:50.406027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000f50c cdw11:00000000
00:09:37.484 [2024-06-11 12:03:50.406066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:37.484 [2024-06-11 12:03:50.406132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000f50c cdw11:00000000
00:09:37.484 [2024-06-11 12:03:50.406152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:37.484 #24 NEW cov: 11665 ft: 13810 corp: 17/77b lim: 10 exec/s: 24 rss: 69Mb L: 4/10 MS: 1 CrossOver-
00:09:37.484 [2024-06-11 12:03:50.456070] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004a0a cdw11:00000000
00:09:37.484 [2024-06-11 12:03:50.456103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:37.484 #25 NEW cov: 11665 ft: 13854 corp: 18/79b lim: 10 exec/s: 25 rss: 69Mb L: 2/10 MS: 1 ChangeBit-
00:09:37.484 [2024-06-11 12:03:50.506179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000f50c cdw11:00000000
00:09:37.484 [2024-06-11 12:03:50.506213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:37.743 #26 NEW cov: 11665 ft: 13914 corp: 19/81b lim: 10 exec/s: 26 rss: 69Mb L: 2/10 MS: 1 ShuffleBytes-
00:09:37.743 [2024-06-11 12:03:50.556933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000f5ff cdw11:00000000
00:09:37.743 [2024-06-11 12:03:50.556967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:37.743 [2024-06-11 12:03:50.557034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000
00:09:37.743 [2024-06-11 12:03:50.557054] nvme_qpair.c:
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:37.743 [2024-06-11 12:03:50.557117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:09:37.743 [2024-06-11 12:03:50.557136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:37.743 [2024-06-11 12:03:50.557203] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:000000d8 cdw11:00000000 00:09:37.743 [2024-06-11 12:03:50.557222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:37.743 [2024-06-11 12:03:50.557289] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:000000f3 cdw11:00000000 00:09:37.743 [2024-06-11 12:03:50.557308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:37.743 #27 NEW cov: 11665 ft: 13952 corp: 20/91b lim: 10 exec/s: 27 rss: 69Mb L: 10/10 MS: 1 ChangeBinInt- 00:09:37.743 [2024-06-11 12:03:50.616489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000e0a cdw11:00000000 00:09:37.743 [2024-06-11 12:03:50.616523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:37.743 #28 NEW cov: 11665 ft: 13976 corp: 21/93b lim: 10 exec/s: 28 rss: 69Mb L: 2/10 MS: 1 ChangeBit- 00:09:37.743 [2024-06-11 12:03:50.656622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004a0a cdw11:00000000 00:09:37.743 [2024-06-11 12:03:50.656656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:37.743 #29 NEW cov: 11665 ft: 13990 corp: 22/96b lim: 10 exec/s: 29 rss: 69Mb L: 3/10 MS: 1 CrossOver- 00:09:37.743 [2024-06-11 12:03:50.717256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000f5ff cdw11:00000000 00:09:37.743 [2024-06-11 12:03:50.717291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:37.743 [2024-06-11 12:03:50.717369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:09:37.743 [2024-06-11 12:03:50.717389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:37.743 [2024-06-11 12:03:50.717453] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:09:37.743 [2024-06-11 12:03:50.717472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:37.743 [2024-06-11 12:03:50.717538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:09:37.743 [2024-06-11 12:03:50.717556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:37.743 #30 NEW cov: 11665 ft: 14014 corp: 23/105b lim: 10 exec/s: 30 rss: 69Mb L: 9/10 MS: 1 CopyPart- 
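Each "#N NEW" record in this output is a stock libFuzzer status line; the field meanings below are standard libFuzzer semantics rather than anything SPDK-specific. Reading the "#30 NEW" line just above field by field (the gloss on "L: a/b" is the usual libFuzzer convention, stated here as an informed reading of the output rather than quoted from its docs):

  # #30 NEW cov: 11665 ft: 14014 corp: 23/105b lim: 10 exec/s: 30 rss: 69Mb L: 9/10 MS: 1 CopyPart-
  #
  # #30       total inputs executed so far when this event fired
  # NEW       this input reached new coverage and was kept in the corpus
  # cov:      code blocks/edges covered across the whole run
  # ft:       coverage "features" (a finer-grained signal than cov)
  # corp:     the corpus now holds 23 inputs totalling 105 bytes
  # lim:      current cap on input length (this target ramps up to 10 bytes)
  # exec/s:   executions per second (printed as 0 until a full second elapses)
  # rss:      resident memory of the fuzzing process
  # L: 9/10   size of this input / largest input in the corpus
  # MS:       the mutation sequence that produced it (one CopyPart mutation)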
00:09:38.002 [2024-06-11 12:03:50.777436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000f5ff cdw11:00000000 00:09:38.002 [2024-06-11 12:03:50.777471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:38.002 [2024-06-11 12:03:50.777538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 00:09:38.002 [2024-06-11 12:03:50.777557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:38.002 [2024-06-11 12:03:50.777624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:09:38.002 [2024-06-11 12:03:50.777643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:38.002 [2024-06-11 12:03:50.777708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000a00 cdw11:00000000 00:09:38.002 [2024-06-11 12:03:50.777726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:38.002 #31 NEW cov: 11665 ft: 14070 corp: 24/114b lim: 10 exec/s: 31 rss: 69Mb L: 9/10 MS: 1 CrossOver- 00:09:38.002 [2024-06-11 12:03:50.837166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000fd0c cdw11:00000000 00:09:38.002 [2024-06-11 12:03:50.837200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:38.002 #32 NEW cov: 11665 ft: 14074 corp: 25/116b lim: 10 exec/s: 32 rss: 69Mb L: 2/10 MS: 1 ChangeBit- 00:09:38.002 [2024-06-11 12:03:50.877937] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:09:38.002 [2024-06-11 12:03:50.877971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:38.002 [2024-06-11 12:03:50.878038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:09:38.002 [2024-06-11 12:03:50.878057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:38.002 [2024-06-11 12:03:50.878123] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00008000 cdw11:00000000 00:09:38.002 [2024-06-11 12:03:50.878142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:38.002 [2024-06-11 12:03:50.878206] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00002000 cdw11:00000000 00:09:38.002 [2024-06-11 12:03:50.878225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:38.002 [2024-06-11 12:03:50.878290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000f50c cdw11:00000000 00:09:38.002 [2024-06-11 12:03:50.878311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 
00:09:38.002 #33 NEW cov: 11665 ft: 14115 corp: 26/126b lim: 10 exec/s: 33 rss: 70Mb L: 10/10 MS: 1 ChangeBit- 00:09:38.002 [2024-06-11 12:03:50.937545] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000feff cdw11:00000000 00:09:38.002 [2024-06-11 12:03:50.937579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:38.002 #34 NEW cov: 11665 ft: 14148 corp: 27/128b lim: 10 exec/s: 34 rss: 70Mb L: 2/10 MS: 1 ChangeBit- 00:09:38.002 [2024-06-11 12:03:50.997665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000505 cdw11:00000000 00:09:38.002 [2024-06-11 12:03:50.997698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:38.261 #35 NEW cov: 11665 ft: 14160 corp: 28/130b lim: 10 exec/s: 17 rss: 70Mb L: 2/10 MS: 1 ShuffleBytes- 00:09:38.261 #35 DONE cov: 11665 ft: 14160 corp: 28/130b lim: 10 exec/s: 17 rss: 70Mb 00:09:38.261 ###### Recommended dictionary. ###### 00:09:38.261 "\377\377" # Uses: 2 00:09:38.261 ###### End of recommended dictionary. ###### 00:09:38.261 Done 35 runs in 2 second(s) 00:09:38.261 12:03:51 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_6.conf 00:09:38.261 12:03:51 -- ../common.sh@72 -- # (( i++ )) 00:09:38.261 12:03:51 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:38.261 12:03:51 -- ../common.sh@73 -- # start_llvm_fuzz 7 1 0x1 00:09:38.261 12:03:51 -- nvmf/run.sh@23 -- # local fuzzer_type=7 00:09:38.261 12:03:51 -- nvmf/run.sh@24 -- # local timen=1 00:09:38.261 12:03:51 -- nvmf/run.sh@25 -- # local core=0x1 00:09:38.261 12:03:51 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:09:38.261 12:03:51 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_7.conf 00:09:38.261 12:03:51 -- nvmf/run.sh@29 -- # printf %02d 7 00:09:38.261 12:03:51 -- nvmf/run.sh@29 -- # port=4407 00:09:38.261 12:03:51 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:09:38.261 12:03:51 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' 00:09:38.261 12:03:51 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4407"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:38.261 12:03:51 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' -c /tmp/fuzz_json_7.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 -Z 7 -r /var/tmp/spdk7.sock 00:09:38.261 [2024-06-11 12:03:51.216369] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 
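The "Recommended dictionary" block printed at the end of run 6 above is libFuzzer reporting a token (the two-byte pattern \377\377, i.e. 0xFF 0xFF, used in 2 coverage-increasing mutations) that could seed future runs. Stock libFuzzer accepts such tokens through a -dict=<file> option; note that the log prints values with octal escapes while dictionary files use \xNN hex escapes. Whether llvm_nvme_fuzz forwards -dict through to libFuzzer is not shown in this log, and the file name below is invented, so treat this as a sketch:

  # Hypothetical libFuzzer dictionary file seeded from the report above.
  cat > nvmf_6.dict <<'EOF'
  # two 0xFF bytes; dictionary files take \xNN hex escapes, not octal
  kw_ffff="\xFF\xFF"
  EOF
  # A stock libFuzzer target would then be launched with -dict=nvmf_6.dict;
  # passing that flag through this SPDK harness is an assumption.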
00:09:38.261 [2024-06-11 12:03:51.216441] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2693546 ] 00:09:38.261 EAL: No free 2048 kB hugepages reported on node 1 00:09:38.520 [2024-06-11 12:03:51.458180] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:38.520 [2024-06-11 12:03:51.484297] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:38.520 [2024-06-11 12:03:51.484476] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:38.520 [2024-06-11 12:03:51.538915] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:38.779 [2024-06-11 12:03:51.555151] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4407 *** 00:09:38.779 INFO: Running with entropic power schedule (0xFF, 100). 00:09:38.779 INFO: Seed: 3346101990 00:09:38.779 INFO: Loaded 1 modules (341565 inline 8-bit counters): 341565 [0x26b198c, 0x2704fc9), 00:09:38.779 INFO: Loaded 1 PC tables (341565 PCs): 341565 [0x2704fd0,0x2c3b3a0), 00:09:38.779 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:09:38.779 INFO: A corpus is not provided, starting from an empty corpus 00:09:38.779 #2 INITED exec/s: 0 rss: 61Mb 00:09:38.779 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:38.779 This may also happen if the target rejected all inputs we tried so far 00:09:38.779 [2024-06-11 12:03:51.610787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:09:38.779 [2024-06-11 12:03:51.610826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:39.037 NEW_FUNC[1/662]: 0x4a9b70 in fuzz_admin_delete_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:172 00:09:39.037 NEW_FUNC[2/662]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:39.037 #3 NEW cov: 11434 ft: 11428 corp: 2/3b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 CopyPart- 00:09:39.297 [2024-06-11 12:03:52.082219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:09:39.297 [2024-06-11 12:03:52.082266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:39.297 [2024-06-11 12:03:52.082334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:09:39.297 [2024-06-11 12:03:52.082354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:39.297 [2024-06-11 12:03:52.082424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:09:39.297 [2024-06-11 12:03:52.082444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:39.297 #4 NEW cov: 11551 ft: 12225 corp: 3/9b lim: 10 exec/s: 0 rss: 68Mb L: 6/6 MS: 1 InsertRepeatedBytes- 00:09:39.297 
[2024-06-11 12:03:52.132373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0c cdw11:00000000 00:09:39.297 [2024-06-11 12:03:52.132409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:39.297 [2024-06-11 12:03:52.132474] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000c0c cdw11:00000000 00:09:39.297 [2024-06-11 12:03:52.132494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:39.297 [2024-06-11 12:03:52.132557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:09:39.297 [2024-06-11 12:03:52.132575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:39.297 [2024-06-11 12:03:52.132637] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:09:39.297 [2024-06-11 12:03:52.132655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:39.297 #5 NEW cov: 11557 ft: 12644 corp: 4/18b lim: 10 exec/s: 0 rss: 68Mb L: 9/9 MS: 1 InsertRepeatedBytes- 00:09:39.297 [2024-06-11 12:03:52.192499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0c cdw11:00000000 00:09:39.297 [2024-06-11 12:03:52.192535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:39.297 [2024-06-11 12:03:52.192600] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000c0c cdw11:00000000 00:09:39.297 [2024-06-11 12:03:52.192620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:39.297 [2024-06-11 12:03:52.192683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:09:39.297 [2024-06-11 12:03:52.192706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:39.297 [2024-06-11 12:03:52.192771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:09:39.297 [2024-06-11 12:03:52.192790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:39.297 #6 NEW cov: 11642 ft: 12872 corp: 5/27b lim: 10 exec/s: 0 rss: 69Mb L: 9/9 MS: 1 CrossOver- 00:09:39.297 [2024-06-11 12:03:52.252682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000c0c cdw11:00000000 00:09:39.297 [2024-06-11 12:03:52.252715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:39.297 [2024-06-11 12:03:52.252779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000c00 cdw11:00000000 00:09:39.297 [2024-06-11 12:03:52.252799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:39.297 
[2024-06-11 12:03:52.252860] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:09:39.297 [2024-06-11 12:03:52.252879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:39.297 [2024-06-11 12:03:52.252941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:09:39.297 [2024-06-11 12:03:52.252960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:39.297 #10 NEW cov: 11642 ft: 13005 corp: 6/36b lim: 10 exec/s: 0 rss: 69Mb L: 9/9 MS: 4 ShuffleBytes-ChangeByte-ShuffleBytes-CrossOver- 00:09:39.297 [2024-06-11 12:03:52.302750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:09:39.297 [2024-06-11 12:03:52.302783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:39.297 [2024-06-11 12:03:52.302848] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002500 cdw11:00000000 00:09:39.297 [2024-06-11 12:03:52.302868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:39.297 [2024-06-11 12:03:52.302934] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:09:39.297 [2024-06-11 12:03:52.302952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:39.556 #11 NEW cov: 11642 ft: 13105 corp: 7/43b lim: 10 exec/s: 0 rss: 69Mb L: 7/9 MS: 1 InsertByte- 00:09:39.556 [2024-06-11 12:03:52.352814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:09:39.556 [2024-06-11 12:03:52.352847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:39.556 [2024-06-11 12:03:52.352912] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002500 cdw11:00000000 00:09:39.556 [2024-06-11 12:03:52.352932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:39.556 [2024-06-11 12:03:52.352993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:09:39.556 [2024-06-11 12:03:52.353012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:39.556 #12 NEW cov: 11642 ft: 13160 corp: 8/50b lim: 10 exec/s: 0 rss: 69Mb L: 7/9 MS: 1 CopyPart- 00:09:39.556 [2024-06-11 12:03:52.413033] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0c cdw11:00000000 00:09:39.556 [2024-06-11 12:03:52.413071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:39.556 [2024-06-11 12:03:52.413136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000c0c cdw11:00000000 00:09:39.556 [2024-06-11 12:03:52.413156] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:39.556 [2024-06-11 12:03:52.413217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:09:39.556 [2024-06-11 12:03:52.413236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:39.556 #13 NEW cov: 11642 ft: 13222 corp: 9/56b lim: 10 exec/s: 0 rss: 69Mb L: 6/9 MS: 1 EraseBytes- 00:09:39.557 [2024-06-11 12:03:52.473335] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000ac8 cdw11:00000000 00:09:39.557 [2024-06-11 12:03:52.473374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:39.557 [2024-06-11 12:03:52.473436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000c8c8 cdw11:00000000 00:09:39.557 [2024-06-11 12:03:52.473456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:39.557 [2024-06-11 12:03:52.473518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000c8c8 cdw11:00000000 00:09:39.557 [2024-06-11 12:03:52.473536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:39.557 [2024-06-11 12:03:52.473598] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000c8c8 cdw11:00000000 00:09:39.557 [2024-06-11 12:03:52.473617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:39.557 NEW_FUNC[1/1]: 0x19807e0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:39.557 #15 NEW cov: 11665 ft: 13272 corp: 10/65b lim: 10 exec/s: 0 rss: 69Mb L: 9/9 MS: 2 EraseBytes-InsertRepeatedBytes- 00:09:39.557 [2024-06-11 12:03:52.533372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ecf3 cdw11:00000000 00:09:39.557 [2024-06-11 12:03:52.533407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:39.557 [2024-06-11 12:03:52.533473] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000f3f3 cdw11:00000000 00:09:39.557 [2024-06-11 12:03:52.533493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:39.557 [2024-06-11 12:03:52.533557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:09:39.557 [2024-06-11 12:03:52.533578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:39.557 #16 NEW cov: 11665 ft: 13395 corp: 11/71b lim: 10 exec/s: 0 rss: 69Mb L: 6/9 MS: 1 ChangeBinInt- 00:09:39.816 [2024-06-11 12:03:52.593410] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0c cdw11:00000000 00:09:39.816 [2024-06-11 12:03:52.593445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:39.816 [2024-06-11 12:03:52.593508] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:09:39.816 [2024-06-11 12:03:52.593528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:39.816 #17 NEW cov: 11665 ft: 13534 corp: 12/75b lim: 10 exec/s: 17 rss: 69Mb L: 4/9 MS: 1 EraseBytes- 00:09:39.816 [2024-06-11 12:03:52.643409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:09:39.816 [2024-06-11 12:03:52.643443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:39.816 #18 NEW cov: 11665 ft: 13542 corp: 13/77b lim: 10 exec/s: 18 rss: 69Mb L: 2/9 MS: 1 CrossOver- 00:09:39.816 [2024-06-11 12:03:52.683925] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:09:39.816 [2024-06-11 12:03:52.683958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:39.816 [2024-06-11 12:03:52.684021] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:09:39.816 [2024-06-11 12:03:52.684040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:39.816 [2024-06-11 12:03:52.684100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000a00 cdw11:00000000 00:09:39.816 [2024-06-11 12:03:52.684119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:39.816 [2024-06-11 12:03:52.684181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:09:39.816 [2024-06-11 12:03:52.684199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:39.816 #19 NEW cov: 11665 ft: 13562 corp: 14/85b lim: 10 exec/s: 19 rss: 69Mb L: 8/9 MS: 1 CrossOver- 00:09:39.816 [2024-06-11 12:03:52.733611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:09:39.816 [2024-06-11 12:03:52.733644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:39.816 #20 NEW cov: 11665 ft: 13611 corp: 15/87b lim: 10 exec/s: 20 rss: 69Mb L: 2/9 MS: 1 CopyPart- 00:09:39.816 [2024-06-11 12:03:52.774160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:09:39.816 [2024-06-11 12:03:52.774193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:39.816 [2024-06-11 12:03:52.774259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:09:39.816 [2024-06-11 12:03:52.774279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:39.816 [2024-06-11 12:03:52.774344] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000a00 cdw11:00000000 00:09:39.816 [2024-06-11 12:03:52.774368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:39.816 [2024-06-11 12:03:52.774431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00002500 cdw11:00000000 00:09:39.816 [2024-06-11 12:03:52.774450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:39.816 #21 NEW cov: 11665 ft: 13621 corp: 16/95b lim: 10 exec/s: 21 rss: 69Mb L: 8/9 MS: 1 ChangeByte- 00:09:39.816 [2024-06-11 12:03:52.834340] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:09:39.816 [2024-06-11 12:03:52.834378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:39.816 [2024-06-11 12:03:52.834444] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002500 cdw11:00000000 00:09:39.816 [2024-06-11 12:03:52.834463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:39.816 [2024-06-11 12:03:52.834530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000a0a cdw11:00000000 00:09:39.816 [2024-06-11 12:03:52.834548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:39.816 [2024-06-11 12:03:52.834611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:09:39.816 [2024-06-11 12:03:52.834630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:40.075 #22 NEW cov: 11665 ft: 13663 corp: 17/103b lim: 10 exec/s: 22 rss: 69Mb L: 8/9 MS: 1 CrossOver- 00:09:40.075 [2024-06-11 12:03:52.894226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:09:40.075 [2024-06-11 12:03:52.894260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:40.075 [2024-06-11 12:03:52.894322] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:09:40.075 [2024-06-11 12:03:52.894342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:40.075 #23 NEW cov: 11665 ft: 13687 corp: 18/108b lim: 10 exec/s: 23 rss: 69Mb L: 5/9 MS: 1 EraseBytes- 00:09:40.075 [2024-06-11 12:03:52.944696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:09:40.075 [2024-06-11 12:03:52.944729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:40.076 [2024-06-11 12:03:52.944795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:09:40.076 [2024-06-11 12:03:52.944815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:40.076 [2024-06-11 12:03:52.944878] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:09:40.076 [2024-06-11 12:03:52.944896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:40.076 [2024-06-11 12:03:52.944960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:09:40.076 [2024-06-11 12:03:52.944979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:40.076 #24 NEW cov: 11665 ft: 13698 corp: 19/117b lim: 10 exec/s: 24 rss: 70Mb L: 9/9 MS: 1 ChangeBinInt- 00:09:40.076 [2024-06-11 12:03:52.994727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0c cdw11:00000000 00:09:40.076 [2024-06-11 12:03:52.994760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:40.076 [2024-06-11 12:03:52.994824] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000c0c cdw11:00000000 00:09:40.076 [2024-06-11 12:03:52.994844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:40.076 [2024-06-11 12:03:52.994906] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:09:40.076 [2024-06-11 12:03:52.994925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:40.076 #25 NEW cov: 11665 ft: 13750 corp: 20/124b lim: 10 exec/s: 25 rss: 70Mb L: 7/9 MS: 1 CopyPart- 00:09:40.076 [2024-06-11 12:03:53.044793] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000000c cdw11:00000000 00:09:40.076 [2024-06-11 12:03:53.044827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:40.076 [2024-06-11 12:03:53.044896] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a0c cdw11:00000000 00:09:40.076 [2024-06-11 12:03:53.044915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:40.076 [2024-06-11 12:03:53.044981] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000c00 cdw11:00000000 00:09:40.076 [2024-06-11 12:03:53.045001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:40.076 #26 NEW cov: 11665 ft: 13799 corp: 21/131b lim: 10 exec/s: 26 rss: 70Mb L: 7/9 MS: 1 ShuffleBytes- 00:09:40.076 [2024-06-11 12:03:53.104728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000320a cdw11:00000000 00:09:40.076 [2024-06-11 12:03:53.104763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:40.334 #28 NEW cov: 11665 ft: 13809 corp: 22/133b lim: 10 exec/s: 28 rss: 70Mb L: 2/9 MS: 2 ShuffleBytes-InsertByte- 00:09:40.334 [2024-06-11 12:03:53.144959] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:09:40.334 [2024-06-11 12:03:53.144992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:40.334 [2024-06-11 12:03:53.145060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000025 cdw11:00000000 00:09:40.334 [2024-06-11 12:03:53.145079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:40.334 #29 NEW cov: 11665 ft: 13820 corp: 23/138b lim: 10 exec/s: 29 rss: 70Mb L: 5/9 MS: 1 CrossOver- 00:09:40.334 [2024-06-11 12:03:53.185167] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:09:40.334 [2024-06-11 12:03:53.185200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:40.334 [2024-06-11 12:03:53.185265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:09:40.334 [2024-06-11 12:03:53.185284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:40.334 [2024-06-11 12:03:53.185346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:09:40.334 [2024-06-11 12:03:53.185370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:40.334 #30 NEW cov: 11665 ft: 13843 corp: 24/145b lim: 10 exec/s: 30 rss: 70Mb L: 7/9 MS: 1 CrossOver- 00:09:40.334 [2024-06-11 12:03:53.225145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:09:40.334 [2024-06-11 12:03:53.225179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:40.334 [2024-06-11 12:03:53.225247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:09:40.335 [2024-06-11 12:03:53.225267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:40.335 #31 NEW cov: 11665 ft: 13881 corp: 25/149b lim: 10 exec/s: 31 rss: 70Mb L: 4/9 MS: 1 EraseBytes- 00:09:40.335 [2024-06-11 12:03:53.275733] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000c0c cdw11:00000000 00:09:40.335 [2024-06-11 12:03:53.275768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:40.335 [2024-06-11 12:03:53.275834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00003b0c cdw11:00000000 00:09:40.335 [2024-06-11 12:03:53.275858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:40.335 [2024-06-11 12:03:53.275897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:09:40.335 [2024-06-11 12:03:53.275916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:40.335 [2024-06-11 12:03:53.275981] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:09:40.335 [2024-06-11 12:03:53.275999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:40.335 [2024-06-11 12:03:53.276065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000003f cdw11:00000000 00:09:40.335 [2024-06-11 12:03:53.276083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:40.335 #32 NEW cov: 11665 ft: 13922 corp: 26/159b lim: 10 exec/s: 32 rss: 70Mb L: 10/10 MS: 1 InsertByte- 00:09:40.335 [2024-06-11 12:03:53.335661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ecf3 cdw11:00000000 00:09:40.335 [2024-06-11 12:03:53.335695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:40.335 [2024-06-11 12:03:53.335761] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000f3f3 cdw11:00000000 00:09:40.335 [2024-06-11 12:03:53.335781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:40.335 [2024-06-11 12:03:53.335844] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000001d cdw11:00000000 00:09:40.335 [2024-06-11 12:03:53.335863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:40.593 #33 NEW cov: 11665 ft: 13930 corp: 27/165b lim: 10 exec/s: 33 rss: 70Mb L: 6/10 MS: 1 ChangeByte- 00:09:40.594 [2024-06-11 12:03:53.395817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:09:40.594 [2024-06-11 12:03:53.395852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:40.594 [2024-06-11 12:03:53.395915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002500 cdw11:00000000 00:09:40.594 [2024-06-11 12:03:53.395936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:40.594 [2024-06-11 12:03:53.395997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:09:40.594 [2024-06-11 12:03:53.396016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:40.594 #34 NEW cov: 11665 ft: 13970 corp: 28/172b lim: 10 exec/s: 34 rss: 70Mb L: 7/10 MS: 1 ChangeBinInt- 00:09:40.594 [2024-06-11 12:03:53.445948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ecf3 cdw11:00000000 00:09:40.594 [2024-06-11 12:03:53.445983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:40.594 [2024-06-11 12:03:53.446048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:000083f3 
cdw11:00000000 00:09:40.594 [2024-06-11 12:03:53.446068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:40.594 [2024-06-11 12:03:53.446130] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000001d cdw11:00000000 00:09:40.594 [2024-06-11 12:03:53.446148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:40.594 #35 NEW cov: 11665 ft: 13976 corp: 29/178b lim: 10 exec/s: 35 rss: 70Mb L: 6/10 MS: 1 ChangeByte- 00:09:40.594 [2024-06-11 12:03:53.506451] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000c0c cdw11:00000000 00:09:40.594 [2024-06-11 12:03:53.506485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:40.594 [2024-06-11 12:03:53.506550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00003b0c cdw11:00000000 00:09:40.594 [2024-06-11 12:03:53.506569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:40.594 [2024-06-11 12:03:53.506633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:09:40.594 [2024-06-11 12:03:53.506652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:40.594 [2024-06-11 12:03:53.506715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000c00 cdw11:00000000 00:09:40.594 [2024-06-11 12:03:53.506733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:40.594 [2024-06-11 12:03:53.506796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000003f cdw11:00000000 00:09:40.594 [2024-06-11 12:03:53.506815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:40.594 #36 NEW cov: 11665 ft: 13981 corp: 30/188b lim: 10 exec/s: 36 rss: 70Mb L: 10/10 MS: 1 CrossOver- 00:09:40.594 [2024-06-11 12:03:53.566262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ecf3 cdw11:00000000 00:09:40.594 [2024-06-11 12:03:53.566298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:40.594 [2024-06-11 12:03:53.566370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000f300 cdw11:00000000 00:09:40.594 [2024-06-11 12:03:53.566391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:40.594 [2024-06-11 12:03:53.566453] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000f300 cdw11:00000000 00:09:40.594 [2024-06-11 12:03:53.566474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:40.594 #37 NEW cov: 11665 ft: 14006 corp: 31/194b lim: 10 exec/s: 18 rss: 70Mb L: 6/10 MS: 1 ShuffleBytes- 00:09:40.594 #37 DONE cov: 11665 ft: 14006 
corp: 31/194b lim: 10 exec/s: 18 rss: 70Mb 00:09:40.594 Done 37 runs in 2 second(s) 00:09:40.853 12:03:53 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_7.conf 00:09:40.853 12:03:53 -- ../common.sh@72 -- # (( i++ )) 00:09:40.853 12:03:53 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:40.853 12:03:53 -- ../common.sh@73 -- # start_llvm_fuzz 8 1 0x1 00:09:40.853 12:03:53 -- nvmf/run.sh@23 -- # local fuzzer_type=8 00:09:40.853 12:03:53 -- nvmf/run.sh@24 -- # local timen=1 00:09:40.853 12:03:53 -- nvmf/run.sh@25 -- # local core=0x1 00:09:40.853 12:03:53 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:09:40.853 12:03:53 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_8.conf 00:09:40.853 12:03:53 -- nvmf/run.sh@29 -- # printf %02d 8 00:09:40.853 12:03:53 -- nvmf/run.sh@29 -- # port=4408 00:09:40.853 12:03:53 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:09:40.853 12:03:53 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' 00:09:40.853 12:03:53 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4408"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:40.853 12:03:53 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' -c /tmp/fuzz_json_8.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 -Z 8 -r /var/tmp/spdk8.sock 00:09:40.853 [2024-06-11 12:03:53.774003] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:09:40.853 [2024-06-11 12:03:53.774078] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2693909 ] 00:09:40.853 EAL: No free 2048 kB hugepages reported on node 1 00:09:41.112 [2024-06-11 12:03:54.027300] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:41.112 [2024-06-11 12:03:54.053637] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:41.112 [2024-06-11 12:03:54.053810] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:41.112 [2024-06-11 12:03:54.108595] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:41.112 [2024-06-11 12:03:54.124818] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4408 *** 00:09:41.112 INFO: Running with entropic power schedule (0xFF, 100). 
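The nvmf/run.sh trace above shows how each fuzzer instance is isolated onto its own TCP listener: fuzzer 8 gets trsvcid 4408, and the stock 4420 port in fuzz_json.conf is rewritten with sed into /tmp/fuzz_json_8.conf. A minimal bash reconstruction of that derivation, inferred from the traced commands (the "44" + printf %02d concatenation and the redirect into the /tmp config are inferred from the trace rather than shown verbatim):

  fuzzer_type=8
  # printf %02d 8 -> "08", giving port 4408 (4407 for fuzzer 7, and so on)
  port="44$(printf %02d "$fuzzer_type")"
  trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:${port}"
  # Rewrite the default listener port so parallel instances do not collide.
  sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"${port}\"/" \
      test/fuzz/llvm/nvmf/fuzz_json.conf > "/tmp/fuzz_json_${fuzzer_type}.conf"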
00:09:41.112 INFO: Seed: 1619193007 00:09:41.371 INFO: Loaded 1 modules (341565 inline 8-bit counters): 341565 [0x26b198c, 0x2704fc9), 00:09:41.371 INFO: Loaded 1 PC tables (341565 PCs): 341565 [0x2704fd0,0x2c3b3a0), 00:09:41.371 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:09:41.371 INFO: A corpus is not provided, starting from an empty corpus 00:09:41.371 [2024-06-11 12:03:54.174203] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:41.371 [2024-06-11 12:03:54.174243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:41.371 #2 INITED cov: 11464 ft: 11463 corp: 1/1b exec/s: 0 rss: 67Mb 00:09:41.371 [2024-06-11 12:03:54.214148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:41.371 [2024-06-11 12:03:54.214183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:41.371 #3 NEW cov: 11579 ft: 12040 corp: 2/2b lim: 5 exec/s: 0 rss: 67Mb L: 1/1 MS: 1 ChangeByte- 00:09:41.371 [2024-06-11 12:03:54.275133] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:41.371 [2024-06-11 12:03:54.275166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:41.371 [2024-06-11 12:03:54.275235] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:41.371 [2024-06-11 12:03:54.275256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:41.371 [2024-06-11 12:03:54.275323] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:41.371 [2024-06-11 12:03:54.275341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:41.371 [2024-06-11 12:03:54.275416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:41.371 [2024-06-11 12:03:54.275435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:41.371 [2024-06-11 12:03:54.275506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:41.371 [2024-06-11 12:03:54.275529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:41.371 #4 NEW cov: 11585 ft: 13053 corp: 3/7b lim: 5 exec/s: 0 rss: 67Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:09:41.371 [2024-06-11 12:03:54.334480] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
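Run 8 (llvm_nvme_fuzz -Z 8) mutates the Namespace Attachment admin command, and each finding prints the submitted command followed by its completion. Decoding the command record just above and the completion that follows against the NVMe specification (field meanings are from the spec, not SPDK internals):

  # NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
  #   (15)      admin opcode 0x15 = Namespace Attachment
  #   qid/cid   queue pair 0 (the admin queue) / command identifier 4
  #   nsid      target namespace ID
  #   cdw10/11  command dwords; for this opcode cdw10's low bits select
  #             controller attach (0) vs detach (1)
  #   SGL ...   SGL descriptor for the 4 KiB (len:0x1000) data buffer,
  #             which carries the controller list for this opcode
  #
  # INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
  #   (00/01)   status code type 0 (generic) / status code 1 = Invalid
  #             Command Opcode, i.e. the target rejects the command outright
  #   sqhd      submission queue head pointer echoed back in the completion
  #   p/m/dnr   phase tag, "more" bit, and do-not-retry bit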
00:09:41.371 [2024-06-11 12:03:54.334513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:41.371 #5 NEW cov: 11670 ft: 13255 corp: 4/8b lim: 5 exec/s: 0 rss: 68Mb L: 1/5 MS: 1 ShuffleBytes- 00:09:41.371 [2024-06-11 12:03:54.385179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:41.371 [2024-06-11 12:03:54.385214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:41.371 [2024-06-11 12:03:54.385287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:41.371 [2024-06-11 12:03:54.385306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:41.371 [2024-06-11 12:03:54.385375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:41.371 [2024-06-11 12:03:54.385394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:41.371 [2024-06-11 12:03:54.385462] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:41.371 [2024-06-11 12:03:54.385481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:41.630 #6 NEW cov: 11670 ft: 13397 corp: 5/12b lim: 5 exec/s: 0 rss: 68Mb L: 4/5 MS: 1 InsertRepeatedBytes- 00:09:41.630 [2024-06-11 12:03:54.434756] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:41.630 [2024-06-11 12:03:54.434790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:41.630 #7 NEW cov: 11670 ft: 13473 corp: 6/13b lim: 5 exec/s: 0 rss: 68Mb L: 1/5 MS: 1 CrossOver- 00:09:41.630 [2024-06-11 12:03:54.484923] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:41.630 [2024-06-11 12:03:54.484957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:41.630 #8 NEW cov: 11670 ft: 13513 corp: 7/14b lim: 5 exec/s: 0 rss: 68Mb L: 1/5 MS: 1 ChangeBit- 00:09:41.630 [2024-06-11 12:03:54.535102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:41.630 [2024-06-11 12:03:54.535136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:41.630 #9 NEW cov: 11670 ft: 13626 corp: 8/15b lim: 5 exec/s: 0 rss: 68Mb L: 1/5 MS: 1 ChangeBit- 00:09:41.630 [2024-06-11 12:03:54.595804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:09:41.630 [2024-06-11 12:03:54.595839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:41.630 [2024-06-11 12:03:54.595907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:41.630 [2024-06-11 12:03:54.595930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:41.630 [2024-06-11 12:03:54.595998] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:41.630 [2024-06-11 12:03:54.596019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:41.630 [2024-06-11 12:03:54.596088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:41.630 [2024-06-11 12:03:54.596107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:41.630 #10 NEW cov: 11670 ft: 13664 corp: 9/19b lim: 5 exec/s: 0 rss: 68Mb L: 4/5 MS: 1 EraseBytes- 00:09:41.630 [2024-06-11 12:03:54.655450] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:41.630 [2024-06-11 12:03:54.655485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:41.889 #11 NEW cov: 11670 ft: 13764 corp: 10/20b lim: 5 exec/s: 0 rss: 68Mb L: 1/5 MS: 1 ChangeByte- 00:09:41.889 [2024-06-11 12:03:54.695499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:41.889 [2024-06-11 12:03:54.695534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:41.889 #12 NEW cov: 11670 ft: 13777 corp: 11/21b lim: 5 exec/s: 0 rss: 68Mb L: 1/5 MS: 1 ChangeBinInt- 00:09:41.889 [2024-06-11 12:03:54.745660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:41.889 [2024-06-11 12:03:54.745694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:41.889 #13 NEW cov: 11670 ft: 13794 corp: 12/22b lim: 5 exec/s: 0 rss: 68Mb L: 1/5 MS: 1 ChangeByte- 00:09:41.889 [2024-06-11 12:03:54.806419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:41.889 [2024-06-11 12:03:54.806454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:41.889 [2024-06-11 12:03:54.806526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:41.889 [2024-06-11 12:03:54.806545] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:41.889 [2024-06-11 12:03:54.806613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:41.889 [2024-06-11 12:03:54.806632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:41.889 [2024-06-11 12:03:54.806699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:41.889 [2024-06-11 12:03:54.806718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:41.889 #14 NEW cov: 11670 ft: 13854 corp: 13/26b lim: 5 exec/s: 0 rss: 68Mb L: 4/5 MS: 1 ChangeByte- 00:09:41.889 [2024-06-11 12:03:54.866790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:41.889 [2024-06-11 12:03:54.866827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:41.889 [2024-06-11 12:03:54.866898] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:41.889 [2024-06-11 12:03:54.866918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:41.889 [2024-06-11 12:03:54.866986] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:41.889 [2024-06-11 12:03:54.867005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:41.889 [2024-06-11 12:03:54.867071] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:41.889 [2024-06-11 12:03:54.867090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:41.889 [2024-06-11 12:03:54.867155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:41.889 [2024-06-11 12:03:54.867173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:41.889 #15 NEW cov: 11670 ft: 13875 corp: 14/31b lim: 5 exec/s: 0 rss: 68Mb L: 5/5 MS: 1 CrossOver- 00:09:41.889 [2024-06-11 12:03:54.916347] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:41.889 [2024-06-11 12:03:54.916399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:41.889 [2024-06-11 12:03:54.916471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:41.889 [2024-06-11 
12:03:54.916491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:42.147 #16 NEW cov: 11670 ft: 14065 corp: 15/33b lim: 5 exec/s: 0 rss: 68Mb L: 2/5 MS: 1 CopyPart- 00:09:42.147 [2024-06-11 12:03:54.966337] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.147 [2024-06-11 12:03:54.966376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:42.147 #17 NEW cov: 11670 ft: 14102 corp: 16/34b lim: 5 exec/s: 0 rss: 68Mb L: 1/5 MS: 1 CrossOver- 00:09:42.147 [2024-06-11 12:03:55.026523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.147 [2024-06-11 12:03:55.026556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:42.147 #18 NEW cov: 11670 ft: 14124 corp: 17/35b lim: 5 exec/s: 0 rss: 68Mb L: 1/5 MS: 1 ShuffleBytes- 00:09:42.147 [2024-06-11 12:03:55.077181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.147 [2024-06-11 12:03:55.077214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:42.148 [2024-06-11 12:03:55.077284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.148 [2024-06-11 12:03:55.077304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:42.148 [2024-06-11 12:03:55.077376] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.148 [2024-06-11 12:03:55.077395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:42.148 [2024-06-11 12:03:55.077462] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.148 [2024-06-11 12:03:55.077480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:42.406 NEW_FUNC[1/1]: 0x19807e0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:42.406 #19 NEW cov: 11693 ft: 14175 corp: 18/39b lim: 5 exec/s: 19 rss: 69Mb L: 4/5 MS: 1 CMP- DE: "\010\000\000\000"- 00:09:42.406 [2024-06-11 12:03:55.408452] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.406 [2024-06-11 12:03:55.408498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:42.406 [2024-06-11 12:03:55.408570] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:09:42.406 [2024-06-11 12:03:55.408589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:42.406 [2024-06-11 12:03:55.408658] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.406 [2024-06-11 12:03:55.408677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:42.406 [2024-06-11 12:03:55.408747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.406 [2024-06-11 12:03:55.408766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:42.406 [2024-06-11 12:03:55.408835] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.407 [2024-06-11 12:03:55.408854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:42.665 #20 NEW cov: 11693 ft: 14202 corp: 19/44b lim: 5 exec/s: 20 rss: 69Mb L: 5/5 MS: 1 CrossOver- 00:09:42.665 [2024-06-11 12:03:55.468564] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.665 [2024-06-11 12:03:55.468600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:42.665 [2024-06-11 12:03:55.468672] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.665 [2024-06-11 12:03:55.468692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:42.665 [2024-06-11 12:03:55.468760] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.665 [2024-06-11 12:03:55.468779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:42.665 [2024-06-11 12:03:55.468846] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.665 [2024-06-11 12:03:55.468864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:42.666 [2024-06-11 12:03:55.468936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.666 [2024-06-11 12:03:55.468955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:42.666 #21 NEW cov: 11693 ft: 14229 corp: 20/49b lim: 5 exec/s: 21 rss: 69Mb L: 5/5 MS: 1 ChangeBit- 00:09:42.666 [2024-06-11 12:03:55.517861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) 
qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.666 [2024-06-11 12:03:55.517895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:42.666 #22 NEW cov: 11693 ft: 14237 corp: 21/50b lim: 5 exec/s: 22 rss: 69Mb L: 1/5 MS: 1 ChangeByte- 00:09:42.666 [2024-06-11 12:03:55.557982] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.666 [2024-06-11 12:03:55.558017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:42.666 #23 NEW cov: 11693 ft: 14247 corp: 22/51b lim: 5 exec/s: 23 rss: 70Mb L: 1/5 MS: 1 ChangeBit- 00:09:42.666 [2024-06-11 12:03:55.618404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.666 [2024-06-11 12:03:55.618437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:42.666 [2024-06-11 12:03:55.618510] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.666 [2024-06-11 12:03:55.618530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:42.666 #24 NEW cov: 11693 ft: 14258 corp: 23/53b lim: 5 exec/s: 24 rss: 70Mb L: 2/5 MS: 1 ChangeBit- 00:09:42.666 [2024-06-11 12:03:55.678593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.666 [2024-06-11 12:03:55.678626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:42.666 [2024-06-11 12:03:55.678695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.666 [2024-06-11 12:03:55.678715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:42.925 #25 NEW cov: 11693 ft: 14273 corp: 24/55b lim: 5 exec/s: 25 rss: 70Mb L: 2/5 MS: 1 InsertByte- 00:09:42.925 [2024-06-11 12:03:55.729260] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.925 [2024-06-11 12:03:55.729294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:42.925 [2024-06-11 12:03:55.729369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.925 [2024-06-11 12:03:55.729388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:42.925 [2024-06-11 12:03:55.729459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.925 
[2024-06-11 12:03:55.729478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:42.925 [2024-06-11 12:03:55.729555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.925 [2024-06-11 12:03:55.729573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:42.925 [2024-06-11 12:03:55.729645] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.925 [2024-06-11 12:03:55.729663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:42.925 #26 NEW cov: 11693 ft: 14290 corp: 25/60b lim: 5 exec/s: 26 rss: 70Mb L: 5/5 MS: 1 CopyPart- 00:09:42.925 [2024-06-11 12:03:55.789083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.925 [2024-06-11 12:03:55.789117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:42.925 [2024-06-11 12:03:55.789189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.925 [2024-06-11 12:03:55.789208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:42.925 [2024-06-11 12:03:55.789278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.925 [2024-06-11 12:03:55.789297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:42.925 #27 NEW cov: 11693 ft: 14452 corp: 26/63b lim: 5 exec/s: 27 rss: 70Mb L: 3/5 MS: 1 EraseBytes- 00:09:42.925 [2024-06-11 12:03:55.839191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.925 [2024-06-11 12:03:55.839225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:42.925 [2024-06-11 12:03:55.839296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.925 [2024-06-11 12:03:55.839315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:42.925 [2024-06-11 12:03:55.839388] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.925 [2024-06-11 12:03:55.839408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:42.925 #28 NEW cov: 11693 ft: 14459 corp: 27/66b lim: 5 exec/s: 28 rss: 70Mb L: 3/5 MS: 1 EraseBytes- 00:09:42.925 [2024-06-11 12:03:55.889715] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.925 [2024-06-11 12:03:55.889749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:42.925 [2024-06-11 12:03:55.889819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.925 [2024-06-11 12:03:55.889840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:42.925 [2024-06-11 12:03:55.889909] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.925 [2024-06-11 12:03:55.889928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:42.925 [2024-06-11 12:03:55.890003] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.925 [2024-06-11 12:03:55.890022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:42.925 [2024-06-11 12:03:55.890093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.925 [2024-06-11 12:03:55.890112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:42.925 #29 NEW cov: 11693 ft: 14480 corp: 28/71b lim: 5 exec/s: 29 rss: 70Mb L: 5/5 MS: 1 ChangeBinInt- 00:09:42.925 [2024-06-11 12:03:55.939093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:42.925 [2024-06-11 12:03:55.939126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:43.184 #30 NEW cov: 11693 ft: 14510 corp: 29/72b lim: 5 exec/s: 30 rss: 70Mb L: 1/5 MS: 1 ShuffleBytes- 00:09:43.184 [2024-06-11 12:03:55.999746] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.184 [2024-06-11 12:03:55.999781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:43.184 [2024-06-11 12:03:55.999853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.184 [2024-06-11 12:03:55.999873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:43.184 [2024-06-11 12:03:55.999941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.184 [2024-06-11 12:03:55.999959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 
p:0 m:0 dnr:0 00:09:43.184 #31 NEW cov: 11693 ft: 14515 corp: 30/75b lim: 5 exec/s: 31 rss: 70Mb L: 3/5 MS: 1 ShuffleBytes- 00:09:43.184 [2024-06-11 12:03:56.060248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.185 [2024-06-11 12:03:56.060282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:43.185 [2024-06-11 12:03:56.060357] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.185 [2024-06-11 12:03:56.060384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:43.185 [2024-06-11 12:03:56.060448] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.185 [2024-06-11 12:03:56.060467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:43.185 [2024-06-11 12:03:56.060537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.185 [2024-06-11 12:03:56.060556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:43.185 [2024-06-11 12:03:56.060623] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.185 [2024-06-11 12:03:56.060645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:43.185 #32 NEW cov: 11693 ft: 14523 corp: 31/80b lim: 5 exec/s: 32 rss: 70Mb L: 5/5 MS: 1 CrossOver- 00:09:43.185 [2024-06-11 12:03:56.120154] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.185 [2024-06-11 12:03:56.120189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:43.185 [2024-06-11 12:03:56.120261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.185 [2024-06-11 12:03:56.120280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:43.185 [2024-06-11 12:03:56.120350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.185 [2024-06-11 12:03:56.120375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:43.185 [2024-06-11 12:03:56.120447] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.185 [2024-06-11 12:03:56.120466] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:43.185 #33 NEW cov: 11693 ft: 14529 corp: 32/84b lim: 5 exec/s: 33 rss: 70Mb L: 4/5 MS: 1 InsertRepeatedBytes- 00:09:43.185 [2024-06-11 12:03:56.169797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.185 [2024-06-11 12:03:56.169832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:43.185 #34 NEW cov: 11693 ft: 14534 corp: 33/85b lim: 5 exec/s: 17 rss: 70Mb L: 1/5 MS: 1 ChangeByte- 00:09:43.185 #34 DONE cov: 11693 ft: 14534 corp: 33/85b lim: 5 exec/s: 17 rss: 70Mb 00:09:43.185 ###### Recommended dictionary. ###### 00:09:43.185 "\010\000\000\000" # Uses: 0 00:09:43.185 ###### End of recommended dictionary. ###### 00:09:43.185 Done 34 runs in 2 second(s) 00:09:43.444 12:03:56 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_8.conf 00:09:43.444 12:03:56 -- ../common.sh@72 -- # (( i++ )) 00:09:43.444 12:03:56 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:43.444 12:03:56 -- ../common.sh@73 -- # start_llvm_fuzz 9 1 0x1 00:09:43.444 12:03:56 -- nvmf/run.sh@23 -- # local fuzzer_type=9 00:09:43.444 12:03:56 -- nvmf/run.sh@24 -- # local timen=1 00:09:43.444 12:03:56 -- nvmf/run.sh@25 -- # local core=0x1 00:09:43.444 12:03:56 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:09:43.444 12:03:56 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_9.conf 00:09:43.444 12:03:56 -- nvmf/run.sh@29 -- # printf %02d 9 00:09:43.444 12:03:56 -- nvmf/run.sh@29 -- # port=4409 00:09:43.444 12:03:56 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:09:43.444 12:03:56 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' 00:09:43.444 12:03:56 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4409"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:43.444 12:03:56 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' -c /tmp/fuzz_json_9.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 -Z 9 -r /var/tmp/spdk9.sock 00:09:43.444 [2024-06-11 12:03:56.373595] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 
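The xtrace lines above show how nvmf/run.sh launches round 9: it picks fuzzer_type 9, derives port 4409 via printf %02d, rewrites the trsvcid in the JSON template into /tmp/fuzz_json_9.conf, and invokes llvm_nvme_fuzz against the freshly started TCP listener. A single round can be re-run by hand with the same flags seen in the trace; a minimal sketch, assuming a built SPDK tree at $SPDK_DIR and the target config already generated (both placeholders, not taken from this log):

    # Flags as in the trace above: -m core mask, -s hugepage memory in MB
    # (matches "-m 512" in the EAL parameters), -t run time (local timen=1),
    # -D seed corpus dir, -Z fuzzer type, -r RPC socket path.
    "$SPDK_DIR"/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 \
        -P "$SPDK_DIR"/../output/llvm/ \
        -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' \
        -c /tmp/fuzz_json_9.conf -t 1 \
        -D "$SPDK_DIR"/../corpus/llvm_nvmf_9 -Z 9 -r /var/tmp/spdk9.sock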
00:09:43.444 [2024-06-11 12:03:56.373675] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2694276 ] 00:09:43.444 EAL: No free 2048 kB hugepages reported on node 1 00:09:43.703 [2024-06-11 12:03:56.623048] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:43.703 [2024-06-11 12:03:56.649787] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:43.703 [2024-06-11 12:03:56.649962] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:43.703 [2024-06-11 12:03:56.704452] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:43.703 [2024-06-11 12:03:56.720699] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4409 *** 00:09:43.961 INFO: Running with entropic power schedule (0xFF, 100). 00:09:43.961 INFO: Seed: 4217132700 00:09:43.961 INFO: Loaded 1 modules (341565 inline 8-bit counters): 341565 [0x26b198c, 0x2704fc9), 00:09:43.961 INFO: Loaded 1 PC tables (341565 PCs): 341565 [0x2704fd0,0x2c3b3a0), 00:09:43.961 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:09:43.961 INFO: A corpus is not provided, starting from an empty corpus 00:09:43.962 [2024-06-11 12:03:56.776219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.962 [2024-06-11 12:03:56.776256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:43.962 #2 INITED cov: 11466 ft: 11467 corp: 1/1b exec/s: 0 rss: 67Mb 00:09:43.962 [2024-06-11 12:03:56.816160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.962 [2024-06-11 12:03:56.816197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:43.962 #3 NEW cov: 11579 ft: 11930 corp: 2/2b lim: 5 exec/s: 0 rss: 67Mb L: 1/1 MS: 1 ChangeBit- 00:09:43.962 [2024-06-11 12:03:56.876559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.962 [2024-06-11 12:03:56.876593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:43.962 [2024-06-11 12:03:56.876661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.962 [2024-06-11 12:03:56.876680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:43.962 #4 NEW cov: 11585 ft: 12830 corp: 3/4b lim: 5 exec/s: 0 rss: 67Mb L: 2/2 MS: 1 CopyPart- 00:09:43.962 [2024-06-11 12:03:56.926533] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.962 [2024-06-11 12:03:56.926566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:43.962 #5 NEW cov: 11670 ft: 12998 corp: 4/5b lim: 5 exec/s: 0 rss: 67Mb L: 1/2 MS: 1 ChangeByte- 00:09:43.962 [2024-06-11 12:03:56.976703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:43.962 [2024-06-11 12:03:56.976736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:44.220 #6 NEW cov: 11670 ft: 13130 corp: 5/6b lim: 5 exec/s: 0 rss: 68Mb L: 1/2 MS: 1 EraseBytes- 00:09:44.220 [2024-06-11 12:03:57.036819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:44.220 [2024-06-11 12:03:57.036853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:44.220 #7 NEW cov: 11670 ft: 13219 corp: 6/7b lim: 5 exec/s: 0 rss: 68Mb L: 1/2 MS: 1 ChangeByte- 00:09:44.221 [2024-06-11 12:03:57.097017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:44.221 [2024-06-11 12:03:57.097050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:44.221 #8 NEW cov: 11670 ft: 13259 corp: 7/8b lim: 5 exec/s: 0 rss: 68Mb L: 1/2 MS: 1 ChangeByte- 00:09:44.221 [2024-06-11 12:03:57.157168] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:44.221 [2024-06-11 12:03:57.157203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:44.221 #9 NEW cov: 11670 ft: 13364 corp: 8/9b lim: 5 exec/s: 0 rss: 68Mb L: 1/2 MS: 1 ChangeByte- 00:09:44.221 [2024-06-11 12:03:57.207297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:44.221 [2024-06-11 12:03:57.207334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:44.221 #10 NEW cov: 11670 ft: 13427 corp: 9/10b lim: 5 exec/s: 0 rss: 68Mb L: 1/2 MS: 1 ChangeByte- 00:09:44.478 [2024-06-11 12:03:57.268215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:44.478 [2024-06-11 12:03:57.268249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:44.478 [2024-06-11 12:03:57.268314] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:44.478 [2024-06-11 12:03:57.268334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:44.478 [2024-06-11 12:03:57.268397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:44.478 [2024-06-11 
12:03:57.268416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:44.478 [2024-06-11 12:03:57.268478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:44.478 [2024-06-11 12:03:57.268496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:44.478 [2024-06-11 12:03:57.268558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:44.478 [2024-06-11 12:03:57.268577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:44.478 #11 NEW cov: 11670 ft: 13810 corp: 10/15b lim: 5 exec/s: 0 rss: 68Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:09:44.478 [2024-06-11 12:03:57.327644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:44.478 [2024-06-11 12:03:57.327679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:44.478 #12 NEW cov: 11670 ft: 13837 corp: 11/16b lim: 5 exec/s: 0 rss: 68Mb L: 1/5 MS: 1 ChangeBit- 00:09:44.478 [2024-06-11 12:03:57.367759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:44.478 [2024-06-11 12:03:57.367794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:44.478 #13 NEW cov: 11670 ft: 13859 corp: 12/17b lim: 5 exec/s: 0 rss: 68Mb L: 1/5 MS: 1 ShuffleBytes- 00:09:44.478 [2024-06-11 12:03:57.417958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:44.478 [2024-06-11 12:03:57.417992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:44.478 #14 NEW cov: 11670 ft: 13888 corp: 13/18b lim: 5 exec/s: 0 rss: 68Mb L: 1/5 MS: 1 ShuffleBytes- 00:09:44.478 [2024-06-11 12:03:57.478259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:44.478 [2024-06-11 12:03:57.478293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:44.478 [2024-06-11 12:03:57.478365] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:44.478 [2024-06-11 12:03:57.478384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:44.736 #15 NEW cov: 11670 ft: 13901 corp: 14/20b lim: 5 exec/s: 0 rss: 68Mb L: 2/5 MS: 1 InsertByte- 00:09:44.736 [2024-06-11 12:03:57.528242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
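Interleaved with the qpair tracing, the #N INITED/NEW lines are standard libFuzzer status output: cov counts covered code edges, ft distinct features, corp the corpus units/bytes, lim the current input-size limit, L: the input length, and MS: the mutation sequence that produced the new coverage (ChangeByte, ChangeBit, CrossOver, InsertRepeatedBytes, ...). To see which mutations are paying off in a saved console log, a small shell sketch (build.log is a placeholder filename):

    # Tally the mutation sequences that yielded new coverage, most frequent first.
    grep -o 'MS: [0-9]* [A-Za-z-]*' build.log | awk '{print $3}' | sort | uniq -c | sort -rn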
00:09:44.736 [2024-06-11 12:03:57.528278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:44.736 #16 NEW cov: 11670 ft: 13920 corp: 15/21b lim: 5 exec/s: 0 rss: 68Mb L: 1/5 MS: 1 ChangeBit- 00:09:44.736 [2024-06-11 12:03:57.568509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:44.736 [2024-06-11 12:03:57.568543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:44.736 [2024-06-11 12:03:57.568611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:44.736 [2024-06-11 12:03:57.568630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:44.736 #17 NEW cov: 11670 ft: 13939 corp: 16/23b lim: 5 exec/s: 0 rss: 68Mb L: 2/5 MS: 1 InsertByte- 00:09:44.736 [2024-06-11 12:03:57.619149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:44.736 [2024-06-11 12:03:57.619183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:44.736 [2024-06-11 12:03:57.619252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:44.736 [2024-06-11 12:03:57.619272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:44.736 [2024-06-11 12:03:57.619338] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:44.736 [2024-06-11 12:03:57.619356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:44.736 [2024-06-11 12:03:57.619425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:44.736 [2024-06-11 12:03:57.619443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:44.736 [2024-06-11 12:03:57.619507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:44.736 [2024-06-11 12:03:57.619530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:44.736 #18 NEW cov: 11670 ft: 13973 corp: 17/28b lim: 5 exec/s: 0 rss: 68Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:09:44.736 [2024-06-11 12:03:57.669384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:44.736 [2024-06-11 12:03:57.669417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:44.736 [2024-06-11 
12:03:57.669483] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:44.736 [2024-06-11 12:03:57.669502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:44.736 [2024-06-11 12:03:57.669564] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:44.736 [2024-06-11 12:03:57.669582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:44.736 [2024-06-11 12:03:57.669644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:44.736 [2024-06-11 12:03:57.669664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:44.736 [2024-06-11 12:03:57.669725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:44.736 [2024-06-11 12:03:57.669744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:44.994 NEW_FUNC[1/1]: 0x19807e0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:44.994 #19 NEW cov: 11693 ft: 14012 corp: 18/33b lim: 5 exec/s: 19 rss: 69Mb L: 5/5 MS: 1 ChangeByte- 00:09:44.994 [2024-06-11 12:03:57.999750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:44.994 [2024-06-11 12:03:57.999794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:45.252 #20 NEW cov: 11693 ft: 14041 corp: 19/34b lim: 5 exec/s: 20 rss: 69Mb L: 1/5 MS: 1 ChangeBit- 00:09:45.252 [2024-06-11 12:03:58.050330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:45.252 [2024-06-11 12:03:58.050371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:45.252 [2024-06-11 12:03:58.050446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:45.252 [2024-06-11 12:03:58.050465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:45.252 [2024-06-11 12:03:58.050535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:45.252 [2024-06-11 12:03:58.050554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:45.252 [2024-06-11 12:03:58.050625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 
len:0x1000
00:09:45.252 [2024-06-11 12:03:58.050644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:09:45.252 #21 NEW cov: 11693 ft: 14076 corp: 20/38b lim: 5 exec/s: 21 rss: 70Mb L: 4/5 MS: 1 InsertRepeatedBytes-
00:09:45.252 [2024-06-11 12:03:58.110655] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:45.252 [2024-06-11 12:03:58.110689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:45.252 [2024-06-11 12:03:58.110759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:45.252 [2024-06-11 12:03:58.110779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:45.252 [2024-06-11 12:03:58.110849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:45.252 [2024-06-11 12:03:58.110868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:45.252 [2024-06-11 12:03:58.110935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:45.252 [2024-06-11 12:03:58.110954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:09:45.252 [2024-06-11 12:03:58.111022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:45.252 [2024-06-11 12:03:58.111041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0
00:09:45.252 #22 NEW cov: 11693 ft: 14097 corp: 21/43b lim: 5 exec/s: 22 rss: 70Mb L: 5/5 MS: 1 ChangeByte-
00:09:45.252 [2024-06-11 12:03:58.170488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:45.252 [2024-06-11 12:03:58.170522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:45.252 [2024-06-11 12:03:58.170593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:45.252 [2024-06-11 12:03:58.170613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:45.252 [2024-06-11 12:03:58.170684] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:45.252 [2024-06-11 12:03:58.170702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:45.252 #23 NEW cov: 11693 ft: 14325 corp: 22/46b lim: 5 exec/s: 23 rss: 70Mb L: 3/5 MS: 1 EraseBytes-
00:09:45.252 [2024-06-11 12:03:58.230271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:45.252 [2024-06-11 12:03:58.230306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:45.252 #24 NEW cov: 11693 ft: 14353 corp: 23/47b lim: 5 exec/s: 24 rss: 70Mb L: 1/5 MS: 1 ChangeByte-
00:09:45.252 [2024-06-11 12:03:58.280589] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:45.252 [2024-06-11 12:03:58.280622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:45.252 [2024-06-11 12:03:58.280694] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:45.252 [2024-06-11 12:03:58.280718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:45.511 #25 NEW cov: 11693 ft: 14359 corp: 24/49b lim: 5 exec/s: 25 rss: 70Mb L: 2/5 MS: 1 CopyPart-
00:09:45.511 [2024-06-11 12:03:58.340776] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:45.511 [2024-06-11 12:03:58.340809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:45.511 [2024-06-11 12:03:58.340878] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:45.511 [2024-06-11 12:03:58.340898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:45.511 #26 NEW cov: 11693 ft: 14398 corp: 25/51b lim: 5 exec/s: 26 rss: 70Mb L: 2/5 MS: 1 CopyPart-
00:09:45.511 [2024-06-11 12:03:58.390913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:45.511 [2024-06-11 12:03:58.390947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:45.511 [2024-06-11 12:03:58.391018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:45.511 [2024-06-11 12:03:58.391038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:45.511 #27 NEW cov: 11693 ft: 14415 corp: 26/53b lim: 5 exec/s: 27 rss: 70Mb L: 2/5 MS: 1 ShuffleBytes-
00:09:45.511 [2024-06-11 12:03:58.441209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:45.511 [2024-06-11 12:03:58.441243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:45.511 [2024-06-11 12:03:58.441317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:45.511 [2024-06-11 12:03:58.441336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:45.511 [2024-06-11 12:03:58.441409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:45.511 [2024-06-11 12:03:58.441428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:45.511 #28 NEW cov: 11693 ft: 14428 corp: 27/56b lim: 5 exec/s: 28 rss: 70Mb L: 3/5 MS: 1 ChangeByte-
00:09:45.511 [2024-06-11 12:03:58.500992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:45.511 [2024-06-11 12:03:58.501026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:45.511 #29 NEW cov: 11693 ft: 14433 corp: 28/57b lim: 5 exec/s: 29 rss: 70Mb L: 1/5 MS: 1 CopyPart-
00:09:45.770 [2024-06-11 12:03:58.561380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:45.770 [2024-06-11 12:03:58.561413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:45.770 [2024-06-11 12:03:58.561487] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:45.770 [2024-06-11 12:03:58.561510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:45.770 #30 NEW cov: 11693 ft: 14437 corp: 29/59b lim: 5 exec/s: 30 rss: 70Mb L: 2/5 MS: 1 CrossOver-
00:09:45.770 [2024-06-11 12:03:58.621727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:45.770 [2024-06-11 12:03:58.621760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:45.770 [2024-06-11 12:03:58.621832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:45.770 [2024-06-11 12:03:58.621852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:45.770 [2024-06-11 12:03:58.621922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:45.770 [2024-06-11 12:03:58.621940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:45.770 #31 NEW cov: 11693 ft: 14453 corp: 30/62b lim: 5 exec/s: 31 rss: 70Mb L: 3/5 MS: 1 CrossOver-
00:09:45.770 [2024-06-11 12:03:58.682084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:45.770 [2024-06-11 12:03:58.682119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:45.770 [2024-06-11 12:03:58.682191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:45.770 [2024-06-11 12:03:58.682210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:45.770 [2024-06-11 12:03:58.682279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:45.770 [2024-06-11 12:03:58.682297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:45.770 [2024-06-11 12:03:58.682371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:45.770 [2024-06-11 12:03:58.682391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:09:45.770 #32 NEW cov: 11693 ft: 14458 corp: 31/66b lim: 5 exec/s: 32 rss: 70Mb L: 4/5 MS: 1 InsertRepeatedBytes-
00:09:45.770 [2024-06-11 12:03:58.732191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:45.770 [2024-06-11 12:03:58.732226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:45.770 [2024-06-11 12:03:58.732299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:45.770 [2024-06-11 12:03:58.732318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:45.770 [2024-06-11 12:03:58.732395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:45.770 [2024-06-11 12:03:58.732415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:45.770 [2024-06-11 12:03:58.732484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:45.770 [2024-06-11 12:03:58.732508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:09:45.770 #33 NEW cov: 11693 ft: 14492 corp: 32/70b lim: 5 exec/s: 16 rss: 70Mb L: 4/5 MS: 1 CopyPart-
00:09:45.770 #33 DONE cov: 11693 ft: 14492 corp: 32/70b lim: 5 exec/s: 16 rss: 70Mb
00:09:45.770 Done 33 runs in 2 second(s)
00:09:46.028 12:03:58 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_9.conf
00:09:46.028 12:03:58 -- ../common.sh@72 -- # (( i++ ))
00:09:46.028 12:03:58 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:09:46.028 12:03:58 -- ../common.sh@73 -- # start_llvm_fuzz 10 1 0x1
00:09:46.028 12:03:58 -- nvmf/run.sh@23 -- # local fuzzer_type=10
00:09:46.028 12:03:58 -- nvmf/run.sh@24 -- # local timen=1
00:09:46.028 12:03:58 -- nvmf/run.sh@25 -- # local core=0x1
00:09:46.028 12:03:58 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10
00:09:46.028 12:03:58 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_10.conf
00:09:46.028 12:03:58 -- nvmf/run.sh@29 -- # printf %02d 10
00:09:46.028 12:03:58 -- nvmf/run.sh@29 -- # port=4410
00:09:46.028 12:03:58 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10
00:09:46.028 12:03:58 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410'
00:09:46.028 12:03:58 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4410"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:09:46.029 12:03:58 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' -c /tmp/fuzz_json_10.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 -Z 10 -r /var/tmp/spdk10.sock
00:09:46.029 [2024-06-11 12:03:58.956023] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization...
00:09:46.029 [2024-06-11 12:03:58.956110] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2694585 ]
00:09:46.158 EAL: No free 2048 kB hugepages reported on node 1
00:09:46.287 [2024-06-11 12:03:59.211651] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:09:46.287 [2024-06-11 12:03:59.238223] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:09:46.287 [2024-06-11 12:03:59.238404] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:09:46.287 [2024-06-11 12:03:59.292896] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:09:46.287 [2024-06-11 12:03:59.309134] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4410 ***
00:09:46.545 INFO: Running with entropic power schedule (0xFF, 100).
00:09:46.545 INFO: Seed: 2510169483
00:09:46.545 INFO: Loaded 1 modules (341565 inline 8-bit counters): 341565 [0x26b198c, 0x2704fc9),
00:09:46.545 INFO: Loaded 1 PC tables (341565 PCs): 341565 [0x2704fd0,0x2c3b3a0),
00:09:46.545 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10
00:09:46.545 INFO: A corpus is not provided, starting from an empty corpus
00:09:46.545 #2 INITED exec/s: 0 rss: 61Mb
00:09:46.545 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:09:46.545 This may also happen if the target rejected all inputs we tried so far
00:09:46.545 [2024-06-11 12:03:59.374935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2f2f2f2f cdw11:2f2f2f2f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:46.545 [2024-06-11 12:03:59.374973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:46.545 [2024-06-11 12:03:59.375046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2f2f2f2f cdw11:2f2f2f0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:46.545 [2024-06-11 12:03:59.375069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:46.804 NEW_FUNC[1/663]: 0x4ab4e0 in fuzz_admin_security_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:205
00:09:46.804 NEW_FUNC[2/663]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:09:46.804 #3 NEW cov: 11489 ft: 11490 corp: 2/17b lim: 40 exec/s: 0 rss: 68Mb L: 16/16 MS: 1 InsertRepeatedBytes-
00:09:47.062 [2024-06-11 12:03:59.846254] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:b42f2f2f cdw11:2f2f2f2f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:47.062 [2024-06-11 12:03:59.846315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:47.062 [2024-06-11 12:03:59.846416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2f2f2f2f cdw11:2f2f2f2f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:47.062 [2024-06-11 12:03:59.846444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:47.062 #23 NEW cov: 11602 ft: 12052 corp: 3/34b lim: 40 exec/s: 0 rss: 68Mb L: 17/17 MS: 5 ChangeByte-CopyPart-CopyPart-ChangeByte-CrossOver-
00:09:47.062 [2024-06-11 12:03:59.896165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2f2f2f32 cdw11:2f2f2f2f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:47.062 [2024-06-11 12:03:59.896200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:47.062 [2024-06-11 12:03:59.896276] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2f2f2f2f cdw11:2f2f2f0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:47.062 [2024-06-11 12:03:59.896296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:47.062 #24 NEW cov: 11608 ft: 12299 corp: 4/50b lim: 40 exec/s: 0 rss: 68Mb L: 16/17 MS: 1 ChangeByte-
00:09:47.062 [2024-06-11 12:03:59.956661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:b42f2f2f cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:47.063 [2024-06-11 12:03:59.956695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:47.063 [2024-06-11 12:03:59.956767] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:47.063 [2024-06-11 12:03:59.956786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:47.063 [2024-06-11 12:03:59.956858] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:30303030 cdw11:302f2f2f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:47.063 [2024-06-11 12:03:59.956876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:47.063 [2024-06-11 12:03:59.956953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:2f2f2f2f cdw11:2f2f2f2f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:47.063 [2024-06-11 12:03:59.956972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:09:47.063 #30 NEW cov: 11693 ft: 13070 corp: 5/84b lim: 40 exec/s: 0 rss: 69Mb L: 34/34 MS: 1 InsertRepeatedBytes-
00:09:47.063 [2024-06-11 12:04:00.016494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2f2f2f32 cdw11:2f2f2f2f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:47.063 [2024-06-11 12:04:00.016528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:47.063 [2024-06-11 12:04:00.016605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2f2fffff cdw11:ffff2f0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:47.063 [2024-06-11 12:04:00.016629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:47.063 #31 NEW cov: 11693 ft: 13139 corp: 6/100b lim: 40 exec/s: 0 rss: 69Mb L: 16/34 MS: 1 CMP- DE: "\377\377\377\377"-
00:09:47.063 [2024-06-11 12:04:00.076716] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2f2f2f32 cdw11:2f2f2f2f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:47.063 [2024-06-11 12:04:00.076757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:47.063 [2024-06-11 12:04:00.076836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2f2fffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:47.063 [2024-06-11 12:04:00.076856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:47.321 #32 NEW cov: 11693 ft: 13277 corp: 7/123b lim: 40 exec/s: 0 rss: 69Mb L: 23/34 MS: 1 InsertRepeatedBytes-
00:09:47.321 [2024-06-11 12:04:00.136730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2f2f2f2f cdw11:2f2f2f2f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:47.321 [2024-06-11 12:04:00.136766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:47.321 #33 NEW cov: 11693 ft: 13637 corp: 8/136b lim: 40 exec/s: 0 rss: 69Mb L: 13/34 MS: 1 EraseBytes-
00:09:47.321 [2024-06-11 12:04:00.186967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:b42f2f2f cdw11:2fb42f2f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:47.321 [2024-06-11 12:04:00.187002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:47.321 [2024-06-11 12:04:00.187077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2f2f2f30 cdw11:30302f30 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:47.321 [2024-06-11 12:04:00.187096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:47.321 #34 NEW cov: 11693 ft: 13694 corp: 9/152b lim: 40 exec/s: 0 rss: 69Mb L: 16/34 MS: 1 CrossOver-
00:09:47.321 [2024-06-11 12:04:00.237396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ba2f2f2f cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:47.321 [2024-06-11 12:04:00.237430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:47.321 [2024-06-11 12:04:00.237507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:47.321 [2024-06-11 12:04:00.237528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:47.321 [2024-06-11 12:04:00.237602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:30303030 cdw11:302f2f2f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:47.321 [2024-06-11 12:04:00.237622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:47.321 [2024-06-11 12:04:00.237693] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:2f2f2f2f cdw11:2f2f2f2f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:47.321 [2024-06-11 12:04:00.237714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:09:47.321 NEW_FUNC[1/1]: 0x19807e0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
00:09:47.321 #35 NEW cov: 11716 ft: 13752 corp: 10/186b lim: 40 exec/s: 0 rss: 69Mb L: 34/34 MS: 1 ChangeBinInt-
00:09:47.321 [2024-06-11 12:04:00.297606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2f2f2f32 cdw11:2f2f2f2f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:47.321 [2024-06-11 12:04:00.297641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:47.321 [2024-06-11 12:04:00.297715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2f2fffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:47.321 [2024-06-11 12:04:00.297736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:47.321 [2024-06-11 12:04:00.297808] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:47.321 [2024-06-11 12:04:00.297827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:47.321 [2024-06-11 12:04:00.297902] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:47.321 [2024-06-11 12:04:00.297921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:09:47.321 #36 NEW cov: 11716 ft: 13775 corp: 11/225b lim: 40 exec/s: 0 rss: 69Mb L: 39/39 MS: 1 InsertRepeatedBytes-
00:09:47.580 [2024-06-11 12:04:00.357351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2f2f2fff cdw11:ffffff2f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:47.580 [2024-06-11 12:04:00.357392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:47.580 #37 NEW cov: 11716 ft: 13809 corp: 12/238b lim: 40 exec/s: 37 rss: 69Mb L: 13/39 MS: 1 PersAutoDict- DE: "\377\377\377\377"-
00:09:47.580 [2024-06-11 12:04:00.417670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2f2f2f32 cdw11:2f2fcdd0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:47.580 [2024-06-11 12:04:00.417705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:47.580 [2024-06-11 12:04:00.417782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:d0d00000 cdw11:00002f0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:47.580 [2024-06-11 12:04:00.417802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:47.580 #38 NEW cov: 11716 ft: 13867 corp: 13/254b lim: 40 exec/s: 38 rss: 69Mb L: 16/39 MS: 1 ChangeBinInt-
00:09:47.580 [2024-06-11 12:04:00.467756] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2f2f2f32 cdw11:2f2f2f2f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:47.580 [2024-06-11 12:04:00.467790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:47.580 [2024-06-11 12:04:00.467867] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2f2fffff cdw11:10000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:47.580 [2024-06-11 12:04:00.467886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:47.580 #39 NEW cov: 11716 ft: 13877 corp: 14/270b lim: 40 exec/s: 39 rss: 69Mb L: 16/39 MS: 1 ChangeBinInt-
00:09:47.580 [2024-06-11 12:04:00.517882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2f2f2f32 cdw11:2f2f2f6b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:47.580 [2024-06-11 12:04:00.517916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:47.580 [2024-06-11 12:04:00.517994] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2f2fffff cdw11:10000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:47.580 [2024-06-11 12:04:00.518018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:47.580 #40 NEW cov: 11716 ft: 13896 corp: 15/286b lim: 40 exec/s: 40 rss: 69Mb L: 16/39 MS: 1 ChangeByte-
00:09:47.580 [2024-06-11 12:04:00.578118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2f2f2f32 cdw11:2f2f2f2f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:47.580 [2024-06-11 12:04:00.578152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:47.580 [2024-06-11 12:04:00.578228] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2f2fffff cdw11:2f000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:47.580 [2024-06-11 12:04:00.578248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:47.580 #41 NEW cov: 11716 ft: 13909 corp: 16/302b lim: 40 exec/s: 41 rss: 69Mb L: 16/39 MS: 1 ChangeByte-
00:09:47.838 [2024-06-11 12:04:00.628262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2f322f2f cdw11:2f2f2f2f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:47.838 [2024-06-11 12:04:00.628297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:47.838 [2024-06-11 12:04:00.628379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2f2f2f2f cdw11:2f2f2f0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:47.838 [2024-06-11 12:04:00.628400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:47.838 #42 NEW cov: 11716 ft: 13935 corp: 17/318b lim: 40 exec/s: 42 rss: 70Mb L: 16/39 MS: 1 ShuffleBytes-
00:09:47.838 [2024-06-11 12:04:00.678442] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2f2f2f32 cdw11:2f2f2f2f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:47.838 [2024-06-11 12:04:00.678476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:47.838 [2024-06-11 12:04:00.678552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2f2f2f2f cdw11:ffff2f0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:47.838 [2024-06-11 12:04:00.678572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:47.838 #43 NEW cov: 11716 ft: 14005 corp: 18/334b lim: 40 exec/s: 43 rss: 70Mb L: 16/39 MS: 1 CopyPart-
00:09:47.838 [2024-06-11 12:04:00.718843] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:b42f2f2f cdw11:305b3030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:47.838 [2024-06-11 12:04:00.718877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:47.838 [2024-06-11 12:04:00.718952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:47.838 [2024-06-11 12:04:00.718972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:47.838 [2024-06-11 12:04:00.719048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:30303030 cdw11:302f2f2f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:47.838 [2024-06-11 12:04:00.719067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:47.838 [2024-06-11 12:04:00.719138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:2f2f2f2f cdw11:2f2f2f2f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:47.838 [2024-06-11 12:04:00.719157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:09:47.838 #44 NEW cov: 11716 ft: 14018 corp: 19/368b lim: 40 exec/s: 44 rss: 70Mb L: 34/39 MS: 1 ChangeByte-
00:09:47.838 [2024-06-11 12:04:00.768637] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2f2f3f32 cdw11:2f2f2f2f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:47.838 [2024-06-11 12:04:00.768670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:47.838 [2024-06-11 12:04:00.768749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2f2fffff cdw11:10000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:47.838 [2024-06-11 12:04:00.768769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:47.838 #45 NEW cov: 11716 ft: 14048 corp: 20/384b lim: 40 exec/s: 45 rss: 70Mb L: 16/39 MS: 1 ChangeBit-
00:09:47.838 [2024-06-11 12:04:00.808805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2f2f2f32 cdw11:2f2fcdd0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:47.838 [2024-06-11 12:04:00.808839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:47.838 [2024-06-11 12:04:00.808915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:d0d00000 cdw11:00d0d000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:47.838 [2024-06-11 12:04:00.808935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:47.839 #46 NEW cov: 11716 ft: 14063 corp: 21/404b lim: 40 exec/s: 46 rss: 70Mb L: 20/39 MS: 1 CopyPart-
00:09:47.839 [2024-06-11 12:04:00.868991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3f2f2f32 cdw11:2f2f2f2f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:47.839 [2024-06-11 12:04:00.869024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:47.839 [2024-06-11 12:04:00.869097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2f2f2f2f cdw11:2f2f2f0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:47.839 [2024-06-11 12:04:00.869116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:48.096 #47 NEW cov: 11716 ft: 14070 corp: 22/420b lim: 40 exec/s: 47 rss: 70Mb L: 16/39 MS: 1 ChangeByte-
00:09:48.096 [2024-06-11 12:04:00.909245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2f2f2f00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:48.096 [2024-06-11 12:04:00.909280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:48.096 [2024-06-11 12:04:00.909356] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:322f2fcd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:48.096 [2024-06-11 12:04:00.909385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:48.097 [2024-06-11 12:04:00.909458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:d0d0d000 cdw11:0000002f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:48.097 [2024-06-11 12:04:00.909477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:48.097 #48 NEW cov: 11716 ft: 14259 corp: 23/445b lim: 40 exec/s: 48 rss: 70Mb L: 25/39 MS: 1 InsertRepeatedBytes-
00:09:48.097 [2024-06-11 12:04:00.959371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:b42f2f2f cdw11:2fb42f2f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:48.097 [2024-06-11 12:04:00.959404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:48.097 [2024-06-11 12:04:00.959477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2f2f2f19 cdw11:19191919 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:48.097 [2024-06-11 12:04:00.959501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:48.097 [2024-06-11 12:04:00.959577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:19191919 cdw11:3030302f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:48.097 [2024-06-11 12:04:00.959596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:48.097 #49 NEW cov: 11716 ft: 14285 corp: 24/470b lim: 40 exec/s: 49 rss: 70Mb L: 25/39 MS: 1 InsertRepeatedBytes-
00:09:48.097 [2024-06-11 12:04:01.019385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2f2f2f32 cdw11:2f2f2f6b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:48.097 [2024-06-11 12:04:01.019418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:48.097 [2024-06-11 12:04:01.019494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:7a2f2fff cdw11:ff100000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:48.097 [2024-06-11 12:04:01.019514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:48.097 #50 NEW cov: 11716 ft: 14294 corp: 25/487b lim: 40 exec/s: 50 rss: 70Mb L: 17/39 MS: 1 InsertByte-
00:09:48.097 [2024-06-11 12:04:01.079578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:b42f2f2f cdw11:2f2f2f2f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:48.097 [2024-06-11 12:04:01.079611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:48.097 [2024-06-11 12:04:01.079686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2f2f2f2f cdw11:2f2f2f2f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:48.097 [2024-06-11 12:04:01.079705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:48.097 #51 NEW cov: 11716 ft: 14364 corp: 26/505b lim: 40 exec/s: 51 rss: 70Mb L: 18/39 MS: 1 InsertByte-
00:09:48.355 [2024-06-11 12:04:01.129854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:af2f2f00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:48.355 [2024-06-11 12:04:01.129888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:48.355 [2024-06-11 12:04:01.129969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:322f2fcd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:48.355 [2024-06-11 12:04:01.129989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:48.355 [2024-06-11 12:04:01.130064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:d0d0d000 cdw11:0000002f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:48.355 [2024-06-11 12:04:01.130083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:48.355 #52 NEW cov: 11716 ft: 14385 corp: 27/530b lim: 40 exec/s: 52 rss: 70Mb L: 25/39 MS: 1 ChangeBit-
00:09:48.355 [2024-06-11 12:04:01.189883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:b42f2f2f cdw11:2f2f2f2f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:48.355 [2024-06-11 12:04:01.189918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:48.355 [2024-06-11 12:04:01.189993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2f2f2f2f cdw11:2f2f2f2f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:48.355 [2024-06-11 12:04:01.190014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:48.355 #53 NEW cov: 11716 ft: 14413 corp: 28/547b lim: 40 exec/s: 53 rss: 70Mb L: 17/39 MS: 1 CrossOver-
00:09:48.355 [2024-06-11 12:04:01.229986] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2f2f2f2f cdw11:2f2fffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:48.355 [2024-06-11 12:04:01.230020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:48.355 [2024-06-11 12:04:01.230097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff2f2f2f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:48.355 [2024-06-11 12:04:01.230117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:48.355 #54 NEW cov: 11716 ft: 14430 corp: 29/567b lim: 40 exec/s: 54 rss: 70Mb L: 20/39 MS: 1 InsertRepeatedBytes-
00:09:48.355 [2024-06-11 12:04:01.280116] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2f2f3f32 cdw11:2f2f2f2f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:48.355 [2024-06-11 12:04:01.280150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:48.355 [2024-06-11 12:04:01.280225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2f2fffff cdw11:10000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:48.355 [2024-06-11 12:04:01.280245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:48.355 #55 NEW cov: 11716 ft: 14435 corp: 30/583b lim: 40 exec/s: 55 rss: 70Mb L: 16/39 MS: 1 ShuffleBytes-
00:09:48.355 [2024-06-11 12:04:01.340161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2f2f2fff cdw11:ff2f2f2f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:09:48.355 [2024-06-11 12:04:01.340196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:48.355 #56 NEW cov: 11716 ft: 14439 corp: 31/595b lim: 40 exec/s: 28 rss: 70Mb L: 12/39 MS: 1 CrossOver-
00:09:48.355 #56 DONE cov: 11716 ft: 14439 corp: 31/595b lim: 40 exec/s: 28 rss: 70Mb
00:09:48.355 ###### Recommended dictionary. ######
00:09:48.355 "\377\377\377\377" # Uses: 1
00:09:48.355 ###### End of recommended dictionary. ######
00:09:48.355 Done 56 runs in 2 second(s)
00:09:48.613 12:04:01 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_10.conf
00:09:48.613 12:04:01 -- ../common.sh@72 -- # (( i++ ))
00:09:48.613 12:04:01 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:09:48.613 12:04:01 -- ../common.sh@73 -- # start_llvm_fuzz 11 1 0x1
00:09:48.613 12:04:01 -- nvmf/run.sh@23 -- # local fuzzer_type=11
00:09:48.613 12:04:01 -- nvmf/run.sh@24 -- # local timen=1
00:09:48.613 12:04:01 -- nvmf/run.sh@25 -- # local core=0x1
00:09:48.613 12:04:01 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11
00:09:48.613 12:04:01 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_11.conf
00:09:48.613 12:04:01 -- nvmf/run.sh@29 -- # printf %02d 11
00:09:48.613 12:04:01 -- nvmf/run.sh@29 -- # port=4411
00:09:48.613 12:04:01 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11
00:09:48.613 12:04:01 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411'
00:09:48.613 12:04:01 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4411"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:09:48.613 12:04:01 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' -c /tmp/fuzz_json_11.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 -Z 11 -r /var/tmp/spdk11.sock
00:09:48.613 [2024-06-11 12:04:01.562169] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization...
00:09:48.614 [2024-06-11 12:04:01.562269] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2694899 ]
00:09:48.871 EAL: No free 2048 kB hugepages reported on node 1
00:09:48.871 [2024-06-11 12:04:01.819436] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:09:48.871 [2024-06-11 12:04:01.845645] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:09:48.871 [2024-06-11 12:04:01.845818] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:09:48.871 [2024-06-11 12:04:01.900298] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:09:49.130 [2024-06-11 12:04:01.916551] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4411 ***
00:09:49.130 INFO: Running with entropic power schedule (0xFF, 100).
00:09:49.130 INFO: Seed: 823203121
00:09:49.130 INFO: Loaded 1 modules (341565 inline 8-bit counters): 341565 [0x26b198c, 0x2704fc9),
00:09:49.130 INFO: Loaded 1 PC tables (341565 PCs): 341565 [0x2704fd0,0x2c3b3a0),
00:09:49.130 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11
00:09:49.130 INFO: A corpus is not provided, starting from an empty corpus
00:09:49.130 #2 INITED exec/s: 0 rss: 61Mb
00:09:49.130 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:09:49.130 This may also happen if the target rejected all inputs we tried so far
00:09:49.130 [2024-06-11 12:04:01.971895] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:49.130 [2024-06-11 12:04:01.971945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:49.130 [2024-06-11 12:04:01.971998] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:49.130 [2024-06-11 12:04:01.972023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:49.130 [2024-06-11 12:04:01.972071] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:49.130 [2024-06-11 12:04:01.972096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:49.389 NEW_FUNC[1/664]: 0x4ad250 in fuzz_admin_security_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:223
00:09:49.389 NEW_FUNC[2/664]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:09:49.389 #11 NEW cov: 11501 ft: 11502 corp: 2/26b lim: 40 exec/s: 0 rss: 68Mb L: 25/25 MS: 4 ChangeByte-CrossOver-CrossOver-InsertRepeatedBytes-
00:09:49.389 [2024-06-11 12:04:02.332823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:fffffeff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:49.389 [2024-06-11 12:04:02.332880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:49.389 [2024-06-11 12:04:02.332932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:49.389 [2024-06-11 12:04:02.332956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:49.389 [2024-06-11 12:04:02.333002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:49.389 [2024-06-11 12:04:02.333026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:49.389 #12 NEW cov: 11614 ft: 11995 corp: 3/51b lim: 40 exec/s: 0 rss: 69Mb L: 25/25 MS: 1 ChangeBit-
00:09:49.647 [2024-06-11 12:04:02.432864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:49.647 [2024-06-11 12:04:02.432909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:49.647 [2024-06-11 12:04:02.432960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:49.647 [2024-06-11 12:04:02.432984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:49.647 #18 NEW cov: 11620 ft: 12451 corp: 4/71b lim: 40 exec/s: 0 rss: 69Mb L: 20/25 MS: 1 InsertRepeatedBytes-
00:09:49.647 [2024-06-11 12:04:02.503011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffdaffff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:49.647 [2024-06-11 12:04:02.503054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:49.647 [2024-06-11 12:04:02.503105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:49.647 [2024-06-11 12:04:02.503129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:49.647 #19 NEW cov: 11705 ft: 12656 corp: 5/92b lim: 40 exec/s: 0 rss: 69Mb L: 21/25 MS: 1 InsertByte-
00:09:49.647 [2024-06-11 12:04:02.593182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:1a1a0a0a cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:49.647 [2024-06-11 12:04:02.593225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:49.647 #22 NEW cov: 11705 ft: 13520 corp: 6/107b lim: 40 exec/s: 0 rss: 69Mb L: 15/25 MS: 3 ChangeBit-CopyPart-CrossOver-
00:09:49.647 [2024-06-11 12:04:02.673668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffff00 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:49.647 [2024-06-11 12:04:02.673710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:49.648 [2024-06-11 12:04:02.673761] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:49.648 [2024-06-11 12:04:02.673786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:49.648 [2024-06-11 12:04:02.673832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:000000ff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:49.648 [2024-06-11 12:04:02.673855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:49.648 [2024-06-11 12:04:02.673900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:49.648 [2024-06-11 12:04:02.673923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:09:49.906 #23 NEW cov: 11705 ft: 13856 corp: 7/139b lim: 40 exec/s: 0 rss: 69Mb L: 32/32 MS: 1 InsertRepeatedBytes-
00:09:49.906 [2024-06-11 12:04:02.743616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:1a1a0a0a cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:49.906 [2024-06-11 12:04:02.743657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:49.906 #24 NEW cov: 11705 ft: 13960 corp: 8/154b lim: 40 exec/s: 0 rss: 69Mb L: 15/32 MS: 1 CrossOver-
00:09:49.906 [2024-06-11 12:04:02.844035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:49.906 [2024-06-11 12:04:02.844086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:49.906 [2024-06-11 12:04:02.844137] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffff00 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:49.906 [2024-06-11 12:04:02.844161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:49.906 [2024-06-11 12:04:02.844207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0000fdff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:49.906 [2024-06-11 12:04:02.844230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:49.906 NEW_FUNC[1/1]: 0x19807e0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
00:09:49.906 #30 NEW cov: 11722 ft: 14067 corp: 9/179b lim: 40 exec/s: 0 rss: 69Mb L: 25/32 MS: 1 ChangeBinInt-
00:09:49.906 [2024-06-11 12:04:02.914262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:49.906 [2024-06-11 12:04:02.914305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:49.906 [2024-06-11 12:04:02.914356] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:49.906 [2024-06-11 12:04:02.914391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:49.906 [2024-06-11 12:04:02.914437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffe6 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:49.906 [2024-06-11 12:04:02.914461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:50.165 #31 NEW cov: 11722 ft: 14201 corp: 10/205b lim: 40 exec/s: 31 rss: 69Mb L: 26/32 MS: 1 InsertByte-
00:09:50.165 [2024-06-11 12:04:02.984475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:ff5dffff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:50.165 [2024-06-11 12:04:02.984519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:50.165 [2024-06-11 12:04:02.984571] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:50.165 [2024-06-11 12:04:02.984595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:50.165 [2024-06-11 12:04:02.984641] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:50.165 [2024-06-11 12:04:02.984664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:50.165 #32 NEW cov: 11722 ft: 14307 corp: 11/232b lim: 40 exec/s: 32 rss: 69Mb L: 27/32 MS: 1 InsertByte-
00:09:50.165 [2024-06-11 12:04:03.074525] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:1a1a0a0a cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:50.165 [2024-06-11 12:04:03.074568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:50.165 #33 NEW cov: 11722 ft: 14359 corp: 12/247b lim: 40 exec/s: 33 rss: 69Mb L: 15/32 MS: 1 ShuffleBytes-
00:09:50.165 [2024-06-11 12:04:03.164864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affff01 cdw11:00007bff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:50.165 [2024-06-11 12:04:03.164907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:50.165 [2024-06-11 12:04:03.164963] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:50.165 [2024-06-11 12:04:03.164988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:50.425 #34 NEW cov: 11722 ft: 14370 corp: 13/267b lim: 40 exec/s: 34 rss: 69Mb L: 20/32 MS: 1 CMP- DE: "\001\000\000{"-
00:09:50.425 [2024-06-11 12:04:03.235208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ff010000 cdw11:7bffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:50.425 [2024-06-11 12:04:03.235251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:50.425 [2024-06-11 12:04:03.235301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:50.425 [2024-06-11 12:04:03.235326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:50.425 [2024-06-11 12:04:03.235380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:0affff01 cdw11:00007bff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:50.425 [2024-06-11 12:04:03.235404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:50.425 [2024-06-11 12:04:03.235449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:50.425 [2024-06-11 12:04:03.235472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:09:50.425 #35 NEW cov: 11722 ft: 14396 corp: 14/303b lim: 40 exec/s: 35 rss: 70Mb L: 36/36 MS: 1 CopyPart-
00:09:50.425 [2024-06-11 12:04:03.325518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:fffffeff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:50.425 [2024-06-11 12:04:03.325561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:50.425 [2024-06-11 12:04:03.325611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:50.425 [2024-06-11 12:04:03.325635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:50.425 [2024-06-11 12:04:03.325683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:0affffff cdw11:fffeffff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:50.425 [2024-06-11 12:04:03.325706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:50.425 [2024-06-11 12:04:03.325751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:50.425 [2024-06-11 12:04:03.325774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:09:50.425 #36 NEW cov: 11722 ft: 14420 corp: 15/339b lim: 40 exec/s: 36 rss: 70Mb L: 36/36 MS: 1 CrossOver-
00:09:50.425 [2024-06-11 12:04:03.415760] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffff00 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:50.425 [2024-06-11 12:04:03.415803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:50.425 [2024-06-11 12:04:03.415854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00008200 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:50.425 [2024-06-11 12:04:03.415878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:50.425 [2024-06-11 12:04:03.415929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:50.425 [2024-06-11 12:04:03.415953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:50.425 [2024-06-11 12:04:03.415998] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:50.425 [2024-06-11 12:04:03.416021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:09:50.684 #37 NEW cov: 11722 ft: 14434 corp: 16/372b lim: 40 exec/s: 37 rss: 70Mb L: 33/36 MS: 1 InsertByte-
00:09:50.684 [2024-06-11 12:04:03.505799] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affff01 cdw11:00007bff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:50.684 [2024-06-11 12:04:03.505841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:50.684 [2024-06-11 12:04:03.505892] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:50.684 [2024-06-11 12:04:03.505916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:50.684 #38 NEW cov: 11722 ft: 14452 corp: 17/392b lim: 40 exec/s: 38 rss: 70Mb L: 20/36 MS: 1 PersAutoDict- DE: "\001\000\000{"-
00:09:50.684 [2024-06-11 12:04:03.575993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:50.684 [2024-06-11 12:04:03.576036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:50.684 [2024-06-11 12:04:03.576086] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:50.684 [2024-06-11 12:04:03.576110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:50.684 #39 NEW cov: 11722 ft: 14455 corp: 18/412b lim: 40 exec/s: 39 rss: 70Mb L: 20/36 MS: 1 ShuffleBytes-
00:09:50.684 [2024-06-11 12:04:03.646401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffff00 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:50.684 [2024-06-11 12:04:03.646443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:50.684 [2024-06-11 12:04:03.646495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00008200 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:50.684 [2024-06-11 12:04:03.646519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:50.684 [2024-06-11 12:04:03.646566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000100 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:50.684 [2024-06-11 12:04:03.646589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:50.684 [2024-06-11 12:04:03.646633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:50.684 [2024-06-11 12:04:03.646656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:09:50.944 #40 NEW cov: 11722 ft: 14479 corp: 19/445b lim: 40 exec/s: 40 rss: 70Mb L: 33/36 MS: 1 ChangeBit-
00:09:50.944 [2024-06-11 12:04:03.736389] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffdaffff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:50.944 [2024-06-11 12:04:03.736437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:50.944 [2024-06-11 12:04:03.736488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:50.944 [2024-06-11 12:04:03.736512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:50.944 #41 NEW cov: 11722 ft: 14503 corp: 20/466b lim: 40 exec/s: 41 rss: 70Mb L: 21/36 MS: 1 ShuffleBytes-
00:09:50.944 [2024-06-11 12:04:03.826872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0aff0a0a cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:50.944 [2024-06-11 12:04:03.826914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:50.944 [2024-06-11 12:04:03.826965] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffff0000 cdw11:00820000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:50.944 [2024-06-11 12:04:03.826989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:50.944 [2024-06-11 12:04:03.827035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:000100ff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:50.944 [2024-06-11 12:04:03.827058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:50.944 [2024-06-11 12:04:03.827103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:50.944 [2024-06-11 12:04:03.827126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:09:50.944 #44 NEW cov: 11729 ft: 14568 corp: 21/502b lim: 40 exec/s: 44 rss: 70Mb L: 36/36 MS: 3 CrossOver-ShuffleBytes-CrossOver-
00:09:50.944 [2024-06-11 12:04:03.917027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:0019ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:50.944 [2024-06-11 12:04:03.917070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4
cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:09:50.944 [2024-06-11 12:04:03.917121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffff00 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:50.944 [2024-06-11 12:04:03.917145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:09:50.944 [2024-06-11 12:04:03.917191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0000fdff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:09:50.944 [2024-06-11 12:04:03.917215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:09:50.944 #45 NEW cov: 11729 ft: 14590 corp: 22/527b lim: 40 exec/s: 22 rss: 70Mb L: 25/36 MS: 1 ChangeBinInt-
00:09:51.204 #45 DONE cov: 11729 ft: 14590 corp: 22/527b lim: 40 exec/s: 22 rss: 70Mb
00:09:51.204 ###### Recommended dictionary. ######
00:09:51.204 "\001\000\000{" # Uses: 1
00:09:51.204 ###### End of recommended dictionary. ######
00:09:51.204 Done 45 runs in 2 second(s)
00:09:51.204 12:04:04 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_11.conf
00:09:51.204 12:04:04 -- ../common.sh@72 -- # (( i++ ))
00:09:51.204 12:04:04 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:09:51.204 12:04:04 -- ../common.sh@73 -- # start_llvm_fuzz 12 1 0x1
00:09:51.204 12:04:04 -- nvmf/run.sh@23 -- # local fuzzer_type=12
00:09:51.204 12:04:04 -- nvmf/run.sh@24 -- # local timen=1
00:09:51.204 12:04:04 -- nvmf/run.sh@25 -- # local core=0x1
00:09:51.204 12:04:04 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12
00:09:51.204 12:04:04 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_12.conf
00:09:51.204 12:04:04 -- nvmf/run.sh@29 -- # printf %02d 12
00:09:51.204 12:04:04 -- nvmf/run.sh@29 -- # port=4412
00:09:51.204 12:04:04 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12
00:09:51.204 12:04:04 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412'
00:09:51.204 12:04:04 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4412"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:09:51.204 12:04:04 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' -c /tmp/fuzz_json_12.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 -Z 12 -r /var/tmp/spdk12.sock
00:09:51.204 [2024-06-11 12:04:04.157614] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization...
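The run.sh trace above launches SPDK's llvm_nvme_fuzz app for fuzzer 12. Its libFuzzer entry point is the TestOneInput symbol reported in the NEW_FUNC lines below (llvm_nvme_fuzz.c:780), which hands each generated input to a per-opcode builder such as fuzz_admin_directive_send_command (llvm_nvme_fuzz.c:241). As a rough sketch of how such a harness is shaped: only the LLVMFuzzerTestOneInput signature below is real libFuzzer API; struct fake_cmd and build_cmd_from_data are illustrative stand-ins, not SPDK's actual code.

/* Minimal libFuzzer harness sketch; build with: clang -g -fsanitize=fuzzer harness.c
 * Everything except the LLVMFuzzerTestOneInput signature is illustrative only. */
#include <stdint.h>
#include <stddef.h>
#include <string.h>

/* Hypothetical stand-in for an admin command, mirroring the cdw10/cdw11
 * dwords that nvme_qpair.c prints throughout this log. */
struct fake_cmd {
    uint8_t  opc;
    uint32_t cdw10;
    uint32_t cdw11;
};

/* Illustrative analogue of fuzz_admin_directive_send_command: map raw fuzz
 * bytes onto command dwords; a real harness would then submit the command
 * to the NVMe-oF target and wait for its completion. */
static void build_cmd_from_data(struct fake_cmd *cmd, const uint8_t *data, size_t size)
{
    memset(cmd, 0, sizeof(*cmd));
    cmd->opc = 0x19; /* DIRECTIVE SEND, the "(19)" seen in the log */
    if (size >= 4) {
        memcpy(&cmd->cdw10, data, 4);
    }
    if (size >= 8) {
        memcpy(&cmd->cdw11, data + 4, 4);
    }
}

/* libFuzzer calls this once per generated input; returning 0 keeps fuzzing. */
int LLVMFuzzerTestOneInput(const uint8_t *data, size_t size)
{
    struct fake_cmd cmd;
    build_cmd_from_data(&cmd, data, size);
    return 0;
}

The INVALID OPCODE (00/01) completions that dominate the log are the expected outcome of this loop: the target rejects each mutated admin command, and libFuzzer keeps any input that still reached new coverage as a NEW corpus entry.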
00:09:51.204 [2024-06-11 12:04:04.157683] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2695207 ] 00:09:51.204 EAL: No free 2048 kB hugepages reported on node 1 00:09:51.463 [2024-06-11 12:04:04.420203] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:51.463 [2024-06-11 12:04:04.446820] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:51.463 [2024-06-11 12:04:04.446993] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:51.723 [2024-06-11 12:04:04.501587] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:51.723 [2024-06-11 12:04:04.517829] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4412 *** 00:09:51.723 INFO: Running with entropic power schedule (0xFF, 100). 00:09:51.723 INFO: Seed: 3423205844 00:09:51.723 INFO: Loaded 1 modules (341565 inline 8-bit counters): 341565 [0x26b198c, 0x2704fc9), 00:09:51.723 INFO: Loaded 1 PC tables (341565 PCs): 341565 [0x2704fd0,0x2c3b3a0), 00:09:51.723 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:09:51.723 INFO: A corpus is not provided, starting from an empty corpus 00:09:51.723 #2 INITED exec/s: 0 rss: 61Mb 00:09:51.723 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:51.723 This may also happen if the target rejected all inputs we tried so far 00:09:51.723 [2024-06-11 12:04:04.573688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:76767676 cdw11:76767676 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.723 [2024-06-11 12:04:04.573727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:51.723 [2024-06-11 12:04:04.573797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:76767676 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:51.723 [2024-06-11 12:04:04.573817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:52.292 NEW_FUNC[1/664]: 0x4aefc0 in fuzz_admin_directive_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:241 00:09:52.292 NEW_FUNC[2/664]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:52.292 #7 NEW cov: 11498 ft: 11497 corp: 2/18b lim: 40 exec/s: 0 rss: 68Mb L: 17/17 MS: 5 InsertByte-ShuffleBytes-EraseBytes-ShuffleBytes-InsertRepeatedBytes- 00:09:52.292 [2024-06-11 12:04:05.044597] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a767676 cdw11:76767676 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.292 [2024-06-11 12:04:05.044654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:52.292 #8 NEW cov: 11612 ft: 12702 corp: 3/30b lim: 40 exec/s: 0 rss: 68Mb L: 12/17 MS: 1 CrossOver- 00:09:52.292 [2024-06-11 12:04:05.094761] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:76767676 cdw11:76768a8c SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:09:52.292 [2024-06-11 12:04:05.094799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:52.292 [2024-06-11 12:04:05.094869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:76767676 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.292 [2024-06-11 12:04:05.094888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:52.292 #9 NEW cov: 11618 ft: 12917 corp: 4/47b lim: 40 exec/s: 0 rss: 69Mb L: 17/17 MS: 1 ChangeBinInt- 00:09:52.292 [2024-06-11 12:04:05.154911] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:76767676 cdw11:76768a8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.292 [2024-06-11 12:04:05.154945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:52.292 [2024-06-11 12:04:05.155014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:767676ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.292 [2024-06-11 12:04:05.155033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:52.292 #10 NEW cov: 11703 ft: 13116 corp: 5/64b lim: 40 exec/s: 0 rss: 69Mb L: 17/17 MS: 1 CMP- DE: "\377\027"- 00:09:52.292 [2024-06-11 12:04:05.215096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:76767676 cdw11:76767676 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.292 [2024-06-11 12:04:05.215130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:52.292 [2024-06-11 12:04:05.215200] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:76ff1776 cdw11:76767676 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.292 [2024-06-11 12:04:05.215219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:52.292 #11 NEW cov: 11703 ft: 13266 corp: 6/81b lim: 40 exec/s: 0 rss: 69Mb L: 17/17 MS: 1 PersAutoDict- DE: "\377\027"- 00:09:52.292 [2024-06-11 12:04:05.265053] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a767676 cdw11:76767676 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.292 [2024-06-11 12:04:05.265087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:52.292 #12 NEW cov: 11703 ft: 13326 corp: 7/93b lim: 40 exec/s: 0 rss: 69Mb L: 12/17 MS: 1 ChangeByte- 00:09:52.600 [2024-06-11 12:04:05.325590] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0f0a9f9f cdw11:9f9f9f9f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.600 [2024-06-11 12:04:05.325624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:52.600 [2024-06-11 12:04:05.325694] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:9f9f9f9f cdw11:9f9f9f9f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.600 [2024-06-11 12:04:05.325714] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:52.600 [2024-06-11 12:04:05.325782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:9f9f9f9f cdw11:9f9f9f9f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.600 [2024-06-11 12:04:05.325800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:52.600 #17 NEW cov: 11703 ft: 13635 corp: 8/123b lim: 40 exec/s: 0 rss: 69Mb L: 30/30 MS: 5 PersAutoDict-EraseBytes-CMP-EraseBytes-InsertRepeatedBytes- DE: "\377\027"-"\377\017"- 00:09:52.600 [2024-06-11 12:04:05.375404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a767624 cdw11:76767676 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.600 [2024-06-11 12:04:05.375438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:52.600 #18 NEW cov: 11703 ft: 13659 corp: 9/136b lim: 40 exec/s: 0 rss: 69Mb L: 13/30 MS: 1 InsertByte- 00:09:52.600 [2024-06-11 12:04:05.435740] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:76767676 cdw11:76767676 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.600 [2024-06-11 12:04:05.435775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:52.600 [2024-06-11 12:04:05.435840] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:76110000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.600 [2024-06-11 12:04:05.435859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:52.600 NEW_FUNC[1/1]: 0x19807e0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:52.600 #19 NEW cov: 11726 ft: 13697 corp: 10/153b lim: 40 exec/s: 0 rss: 69Mb L: 17/30 MS: 1 ChangeBinInt- 00:09:52.600 [2024-06-11 12:04:05.485712] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a767624 cdw11:76767676 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.600 [2024-06-11 12:04:05.485747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:52.600 #20 NEW cov: 11726 ft: 13720 corp: 11/166b lim: 40 exec/s: 0 rss: 69Mb L: 13/30 MS: 1 ChangeBit- 00:09:52.600 [2024-06-11 12:04:05.546075] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7676768a cdw11:86768a8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.600 [2024-06-11 12:04:05.546111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:52.600 [2024-06-11 12:04:05.546179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:767676ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.600 [2024-06-11 12:04:05.546198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:52.600 #21 NEW cov: 11726 ft: 13814 corp: 12/183b lim: 40 exec/s: 21 rss: 69Mb L: 17/30 MS: 1 ChangeBinInt- 00:09:52.600 [2024-06-11 12:04:05.606088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE 
SEND (19) qid:0 cid:4 nsid:0 cdw10:0a767676 cdw11:76767676 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.600 [2024-06-11 12:04:05.606123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:52.896 #22 NEW cov: 11726 ft: 13903 corp: 13/192b lim: 40 exec/s: 22 rss: 69Mb L: 9/30 MS: 1 EraseBytes- 00:09:52.896 [2024-06-11 12:04:05.656381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a762476 cdw11:76767676 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.896 [2024-06-11 12:04:05.656416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:52.896 [2024-06-11 12:04:05.656479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:76767624 cdw11:76767676 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.896 [2024-06-11 12:04:05.656499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:52.896 #23 NEW cov: 11726 ft: 13958 corp: 14/213b lim: 40 exec/s: 23 rss: 70Mb L: 21/30 MS: 1 CopyPart- 00:09:52.896 [2024-06-11 12:04:05.706372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a767676 cdw11:765f7676 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.896 [2024-06-11 12:04:05.706406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:52.896 #24 NEW cov: 11726 ft: 13964 corp: 15/226b lim: 40 exec/s: 24 rss: 70Mb L: 13/30 MS: 1 ShuffleBytes- 00:09:52.896 [2024-06-11 12:04:05.756661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:76767676 cdw11:76768a8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.896 [2024-06-11 12:04:05.756695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:52.896 [2024-06-11 12:04:05.756760] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:767676ef SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.896 [2024-06-11 12:04:05.756780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:52.896 #25 NEW cov: 11726 ft: 13982 corp: 16/243b lim: 40 exec/s: 25 rss: 70Mb L: 17/30 MS: 1 ChangeBit- 00:09:52.896 [2024-06-11 12:04:05.806835] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:76767676 cdw11:76768a8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.896 [2024-06-11 12:04:05.806869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:52.896 [2024-06-11 12:04:05.806937] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:76f67676 cdw11:767676ef SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.896 [2024-06-11 12:04:05.806956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:52.896 #26 NEW cov: 11726 ft: 13994 corp: 17/260b lim: 40 exec/s: 26 rss: 70Mb L: 17/30 MS: 1 ChangeBit- 00:09:52.896 [2024-06-11 12:04:05.867026] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 
nsid:0 cdw10:7676768a cdw11:8c767676 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.896 [2024-06-11 12:04:05.867060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:52.896 [2024-06-11 12:04:05.867127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:768a8676 cdw11:8a8c7676 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.896 [2024-06-11 12:04:05.867146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:52.896 #27 NEW cov: 11726 ft: 14032 corp: 18/283b lim: 40 exec/s: 27 rss: 70Mb L: 23/30 MS: 1 CopyPart- 00:09:53.177 [2024-06-11 12:04:05.927191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:76767676 cdw11:ff177676 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.177 [2024-06-11 12:04:05.927224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:53.177 [2024-06-11 12:04:05.927294] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:767676ff cdw11:17767676 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.177 [2024-06-11 12:04:05.927314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:53.177 #28 NEW cov: 11726 ft: 14036 corp: 19/302b lim: 40 exec/s: 28 rss: 70Mb L: 19/30 MS: 1 PersAutoDict- DE: "\377\027"- 00:09:53.177 [2024-06-11 12:04:05.987523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:76767676 cdw11:76767676 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.177 [2024-06-11 12:04:05.987556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:53.177 [2024-06-11 12:04:05.987625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:76760808 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.177 [2024-06-11 12:04:05.987644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:53.177 [2024-06-11 12:04:05.987710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:08080808 cdw11:08080808 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.177 [2024-06-11 12:04:05.987732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:53.177 #29 NEW cov: 11726 ft: 14040 corp: 20/331b lim: 40 exec/s: 29 rss: 70Mb L: 29/30 MS: 1 InsertRepeatedBytes- 00:09:53.177 [2024-06-11 12:04:06.037484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7676768a cdw11:8c768a8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.177 [2024-06-11 12:04:06.037518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:53.177 [2024-06-11 12:04:06.037584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:76767676 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.177 [2024-06-11 12:04:06.037604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:09:53.177 #35 NEW cov: 11726 ft: 14064 corp: 21/348b lim: 40 exec/s: 35 rss: 70Mb L: 17/30 MS: 1 CopyPart- 00:09:53.177 [2024-06-11 12:04:06.087829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a762476 cdw11:76767676 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.177 [2024-06-11 12:04:06.087864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:53.177 [2024-06-11 12:04:06.087934] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:76767624 cdw11:767676ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.177 [2024-06-11 12:04:06.087952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:53.177 [2024-06-11 12:04:06.088016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffff76 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.177 [2024-06-11 12:04:06.088035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:53.177 #36 NEW cov: 11726 ft: 14103 corp: 22/377b lim: 40 exec/s: 36 rss: 70Mb L: 29/30 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"- 00:09:53.177 [2024-06-11 12:04:06.148020] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0f0a9f9f cdw11:9f9f9f9f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.177 [2024-06-11 12:04:06.148054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:53.177 [2024-06-11 12:04:06.148121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:9f9f9f9f cdw11:9f9f9f9f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.177 [2024-06-11 12:04:06.148141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:53.177 [2024-06-11 12:04:06.148205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:9f9f9f9f cdw11:9f9f9f9f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.177 [2024-06-11 12:04:06.148224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:53.177 #37 NEW cov: 11726 ft: 14124 corp: 23/407b lim: 40 exec/s: 37 rss: 70Mb L: 30/30 MS: 1 CopyPart- 00:09:53.437 [2024-06-11 12:04:06.208011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7676768a cdw11:86768a8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.437 [2024-06-11 12:04:06.208044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:53.437 [2024-06-11 12:04:06.208111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:367676ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.437 [2024-06-11 12:04:06.208130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:53.437 #38 NEW cov: 11726 ft: 14128 corp: 24/424b lim: 40 exec/s: 38 rss: 70Mb L: 17/30 MS: 1 ChangeBit- 00:09:53.437 [2024-06-11 12:04:06.258090] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:76767676 cdw11:76767676 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.437 [2024-06-11 12:04:06.258124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:53.437 [2024-06-11 12:04:06.258194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:76110000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.437 [2024-06-11 12:04:06.258214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:53.437 #39 NEW cov: 11726 ft: 14138 corp: 25/441b lim: 40 exec/s: 39 rss: 70Mb L: 17/30 MS: 1 ShuffleBytes- 00:09:53.437 [2024-06-11 12:04:06.318094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:32767624 cdw11:76767676 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.437 [2024-06-11 12:04:06.318128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:53.437 #40 NEW cov: 11726 ft: 14159 corp: 26/454b lim: 40 exec/s: 40 rss: 70Mb L: 13/30 MS: 1 ChangeByte- 00:09:53.437 [2024-06-11 12:04:06.368430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7676768a cdw11:86768a8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.437 [2024-06-11 12:04:06.368463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:53.437 [2024-06-11 12:04:06.368527] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:ff0f7676 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.437 [2024-06-11 12:04:06.368546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:53.437 #41 NEW cov: 11726 ft: 14169 corp: 27/473b lim: 40 exec/s: 41 rss: 70Mb L: 19/30 MS: 1 PersAutoDict- DE: "\377\017"- 00:09:53.437 [2024-06-11 12:04:06.408538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7676768a cdw11:86768a8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.437 [2024-06-11 12:04:06.408573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:53.437 [2024-06-11 12:04:06.408639] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:76767676 cdw11:767676ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.437 [2024-06-11 12:04:06.408658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:53.437 #42 NEW cov: 11726 ft: 14181 corp: 28/494b lim: 40 exec/s: 42 rss: 70Mb L: 21/30 MS: 1 CMP- DE: "\000\000\001\000"- 00:09:53.437 [2024-06-11 12:04:06.448650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7676768a cdw11:8c768a8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.437 [2024-06-11 12:04:06.448684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:53.437 [2024-06-11 12:04:06.448751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 
cdw10:76767e76 cdw11:76767676 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.437 [2024-06-11 12:04:06.448770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:53.696 #43 NEW cov: 11726 ft: 14196 corp: 29/511b lim: 40 exec/s: 43 rss: 70Mb L: 17/30 MS: 1 ChangeBit- 00:09:53.696 [2024-06-11 12:04:06.509223] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:bdffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.696 [2024-06-11 12:04:06.509256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:53.696 [2024-06-11 12:04:06.509327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.696 [2024-06-11 12:04:06.509347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:53.696 [2024-06-11 12:04:06.509416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.696 [2024-06-11 12:04:06.509435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:53.696 [2024-06-11 12:04:06.509501] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.696 [2024-06-11 12:04:06.509520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:53.696 #45 NEW cov: 11726 ft: 14506 corp: 30/546b lim: 40 exec/s: 45 rss: 70Mb L: 35/35 MS: 2 ChangeByte-InsertRepeatedBytes- 00:09:53.696 [2024-06-11 12:04:06.559007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:76767676 cdw11:76767676 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.696 [2024-06-11 12:04:06.559042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:53.696 [2024-06-11 12:04:06.559111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:76ff1776 cdw11:76767676 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.696 [2024-06-11 12:04:06.559130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:53.696 #46 NEW cov: 11726 ft: 14521 corp: 31/563b lim: 40 exec/s: 23 rss: 70Mb L: 17/35 MS: 1 ChangeBit- 00:09:53.696 #46 DONE cov: 11726 ft: 14521 corp: 31/563b lim: 40 exec/s: 23 rss: 70Mb 00:09:53.696 ###### Recommended dictionary. ###### 00:09:53.696 "\377\027" # Uses: 3 00:09:53.696 "\377\017" # Uses: 1 00:09:53.696 "\377\377\377\377\377\377\377\377" # Uses: 0 00:09:53.696 "\000\000\001\000" # Uses: 0 00:09:53.696 ###### End of recommended dictionary. 
######
00:09:53.696 Done 46 runs in 2 second(s)
00:09:53.696 12:04:06 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_12.conf
00:09:53.696 12:04:06 -- ../common.sh@72 -- # (( i++ ))
00:09:53.696 12:04:06 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:09:53.696 12:04:06 -- ../common.sh@73 -- # start_llvm_fuzz 13 1 0x1
00:09:53.696 12:04:06 -- nvmf/run.sh@23 -- # local fuzzer_type=13
00:09:53.696 12:04:06 -- nvmf/run.sh@24 -- # local timen=1
00:09:53.696 12:04:06 -- nvmf/run.sh@25 -- # local core=0x1
00:09:53.696 12:04:06 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13
00:09:53.696 12:04:06 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_13.conf
00:09:53.696 12:04:06 -- nvmf/run.sh@29 -- # printf %02d 13
00:09:53.696 12:04:06 -- nvmf/run.sh@29 -- # port=4413
00:09:53.956 12:04:06 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13
00:09:53.956 12:04:06 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413'
00:09:53.956 12:04:06 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4413"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:09:53.956 12:04:06 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' -c /tmp/fuzz_json_13.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 -Z 13 -r /var/tmp/spdk13.sock
00:09:54.215 [2024-06-11 12:04:06.764976] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization...
00:09:54.215 [2024-06-11 12:04:06.765058] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2695583 ]
00:09:54.215 EAL: No free 2048 kB hugepages reported on node 1
00:09:54.215 [2024-06-11 12:04:07.020794] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:09:54.215 [2024-06-11 12:04:07.047499] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:09:54.215 [2024-06-11 12:04:07.047678] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:09:54.215 [2024-06-11 12:04:07.102517] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:09:54.215 [2024-06-11 12:04:07.118759] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4413 ***
00:09:54.215 INFO: Running with entropic power schedule (0xFF, 100).
00:09:54.215 INFO: Seed: 1730244792
00:09:54.215 INFO: Loaded 1 modules (341565 inline 8-bit counters): 341565 [0x26b198c, 0x2704fc9),
00:09:54.215 INFO: Loaded 1 PC tables (341565 PCs): 341565 [0x2704fd0,0x2c3b3a0),
00:09:54.215 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13
00:09:54.215 INFO: A corpus is not provided, starting from an empty corpus
00:09:54.215 #2 INITED exec/s: 0 rss: 61Mb
00:09:54.215 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:09:54.215 This may also happen if the target rejected all inputs we tried so far 00:09:54.215 [2024-06-11 12:04:07.197627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:54.215 [2024-06-11 12:04:07.197678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:54.215 [2024-06-11 12:04:07.197795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:54.215 [2024-06-11 12:04:07.197818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:54.215 [2024-06-11 12:04:07.197930] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:54.215 [2024-06-11 12:04:07.197952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:54.215 [2024-06-11 12:04:07.198070] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:54.215 [2024-06-11 12:04:07.198091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:54.215 [2024-06-11 12:04:07.198213] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffffffa4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:54.215 [2024-06-11 12:04:07.198235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:54.783 NEW_FUNC[1/662]: 0x4b0b80 in fuzz_admin_directive_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:257 00:09:54.783 NEW_FUNC[2/662]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:54.783 #5 NEW cov: 11475 ft: 11476 corp: 2/41b lim: 40 exec/s: 0 rss: 68Mb L: 40/40 MS: 3 CopyPart-ChangeByte-InsertRepeatedBytes- 00:09:54.783 [2024-06-11 12:04:07.688171] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:54.783 [2024-06-11 12:04:07.688221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:54.783 [2024-06-11 12:04:07.688325] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:54.783 [2024-06-11 12:04:07.688351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:54.783 [2024-06-11 12:04:07.688451] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff56ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:54.783 [2024-06-11 12:04:07.688472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 
sqhd:0011 p:0 m:0 dnr:0 00:09:54.783 [2024-06-11 12:04:07.688564] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:54.783 [2024-06-11 12:04:07.688585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:54.783 [2024-06-11 12:04:07.688687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffffffa4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:54.783 [2024-06-11 12:04:07.688711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:54.783 NEW_FUNC[1/1]: 0x19c97a0 in sock_group_impl_poll_count /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/sock/sock.c:710 00:09:54.783 #6 NEW cov: 11600 ft: 11943 corp: 3/81b lim: 40 exec/s: 0 rss: 68Mb L: 40/40 MS: 1 ChangeByte- 00:09:54.783 [2024-06-11 12:04:07.758577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:54.783 [2024-06-11 12:04:07.758620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:54.783 [2024-06-11 12:04:07.758720] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:54.783 [2024-06-11 12:04:07.758741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:54.783 [2024-06-11 12:04:07.758840] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff56ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:54.783 [2024-06-11 12:04:07.758861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:54.783 [2024-06-11 12:04:07.758954] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ff2cffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:54.783 [2024-06-11 12:04:07.758975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:54.783 [2024-06-11 12:04:07.759068] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffffffa4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:54.783 [2024-06-11 12:04:07.759088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:54.783 #7 NEW cov: 11606 ft: 12171 corp: 4/121b lim: 40 exec/s: 0 rss: 68Mb L: 40/40 MS: 1 ChangeByte- 00:09:55.043 [2024-06-11 12:04:07.828679] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.043 [2024-06-11 12:04:07.828715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:55.043 [2024-06-11 12:04:07.828808] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 
cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.043 [2024-06-11 12:04:07.828830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:55.043 [2024-06-11 12:04:07.828928] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff56ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.043 [2024-06-11 12:04:07.828956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:55.043 [2024-06-11 12:04:07.829043] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.043 [2024-06-11 12:04:07.829064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:55.043 [2024-06-11 12:04:07.829163] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:ff7fffff cdw11:ffffffa4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.043 [2024-06-11 12:04:07.829184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:55.043 #8 NEW cov: 11691 ft: 12466 corp: 5/161b lim: 40 exec/s: 0 rss: 69Mb L: 40/40 MS: 1 ChangeBit- 00:09:55.043 [2024-06-11 12:04:07.888233] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.043 [2024-06-11 12:04:07.888270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:55.043 [2024-06-11 12:04:07.888377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.043 [2024-06-11 12:04:07.888398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:55.043 #13 NEW cov: 11691 ft: 13157 corp: 6/183b lim: 40 exec/s: 0 rss: 69Mb L: 22/40 MS: 5 CrossOver-ChangeBit-ChangeBit-CopyPart-InsertRepeatedBytes- 00:09:55.043 [2024-06-11 12:04:07.949534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.043 [2024-06-11 12:04:07.949568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:55.043 [2024-06-11 12:04:07.949670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.043 [2024-06-11 12:04:07.949690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:55.043 [2024-06-11 12:04:07.949783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff56ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.043 [2024-06-11 12:04:07.949803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:55.043 [2024-06-11 
12:04:07.949903] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.043 [2024-06-11 12:04:07.949924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:55.043 [2024-06-11 12:04:07.950020] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:977fffff cdw11:ffffffa4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.043 [2024-06-11 12:04:07.950040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:55.043 #14 NEW cov: 11691 ft: 13194 corp: 7/223b lim: 40 exec/s: 0 rss: 69Mb L: 40/40 MS: 1 ChangeByte- 00:09:55.043 [2024-06-11 12:04:08.019713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.043 [2024-06-11 12:04:08.019748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:55.043 [2024-06-11 12:04:08.019854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.043 [2024-06-11 12:04:08.019877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:55.043 [2024-06-11 12:04:08.019974] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff56ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.043 [2024-06-11 12:04:08.019995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:55.043 [2024-06-11 12:04:08.020088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffff3a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.043 [2024-06-11 12:04:08.020109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:55.043 [2024-06-11 12:04:08.020201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffffffa4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.043 [2024-06-11 12:04:08.020222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:55.043 #15 NEW cov: 11691 ft: 13266 corp: 8/263b lim: 40 exec/s: 0 rss: 69Mb L: 40/40 MS: 1 ChangeByte- 00:09:55.303 [2024-06-11 12:04:08.079300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff56ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.303 [2024-06-11 12:04:08.079336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:55.303 [2024-06-11 12:04:08.079437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffff3a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.303 [2024-06-11 12:04:08.079461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:55.303 [2024-06-11 12:04:08.079554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffa4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.303 [2024-06-11 12:04:08.079576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:55.303 NEW_FUNC[1/1]: 0x19807e0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:55.303 #16 NEW cov: 11714 ft: 13536 corp: 9/287b lim: 40 exec/s: 0 rss: 69Mb L: 24/40 MS: 1 EraseBytes- 00:09:55.303 [2024-06-11 12:04:08.150260] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.303 [2024-06-11 12:04:08.150295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:55.303 [2024-06-11 12:04:08.150399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:fbffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.303 [2024-06-11 12:04:08.150421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:55.303 [2024-06-11 12:04:08.150517] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff56ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.303 [2024-06-11 12:04:08.150537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:55.303 [2024-06-11 12:04:08.150632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.303 [2024-06-11 12:04:08.150653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:55.303 [2024-06-11 12:04:08.150759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:ff7fffff cdw11:ffffffa4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.303 [2024-06-11 12:04:08.150780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:55.303 #17 NEW cov: 11714 ft: 13577 corp: 10/327b lim: 40 exec/s: 17 rss: 69Mb L: 40/40 MS: 1 ChangeBinInt- 00:09:55.303 [2024-06-11 12:04:08.210489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.303 [2024-06-11 12:04:08.210524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:55.303 [2024-06-11 12:04:08.210624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.303 [2024-06-11 12:04:08.210646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:55.303 [2024-06-11 12:04:08.210741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 
cdw10:ffffffff cdw11:ffff56ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.303 [2024-06-11 12:04:08.210761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:55.303 [2024-06-11 12:04:08.210863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ff2cffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.303 [2024-06-11 12:04:08.210884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:55.303 [2024-06-11 12:04:08.210979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffffffa4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.303 [2024-06-11 12:04:08.211000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:55.303 #18 NEW cov: 11714 ft: 13611 corp: 11/367b lim: 40 exec/s: 18 rss: 69Mb L: 40/40 MS: 1 ShuffleBytes- 00:09:55.303 [2024-06-11 12:04:08.280807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.303 [2024-06-11 12:04:08.280840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:55.303 [2024-06-11 12:04:08.280939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.303 [2024-06-11 12:04:08.280959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:55.303 [2024-06-11 12:04:08.281052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff56ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.303 [2024-06-11 12:04:08.281072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:55.303 [2024-06-11 12:04:08.281170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.303 [2024-06-11 12:04:08.281191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:55.303 [2024-06-11 12:04:08.281291] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:7fffffa4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.303 [2024-06-11 12:04:08.281312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:55.303 #19 NEW cov: 11714 ft: 13632 corp: 12/407b lim: 40 exec/s: 19 rss: 69Mb L: 40/40 MS: 1 ShuffleBytes- 00:09:55.563 [2024-06-11 12:04:08.341032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.563 [2024-06-11 12:04:08.341067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:55.563 [2024-06-11 12:04:08.341167] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffff3aff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.563 [2024-06-11 12:04:08.341189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:55.563 [2024-06-11 12:04:08.341281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff56ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.563 [2024-06-11 12:04:08.341302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:55.563 [2024-06-11 12:04:08.341407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ff2cffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.563 [2024-06-11 12:04:08.341430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:55.563 [2024-06-11 12:04:08.341530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffffffa4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.563 [2024-06-11 12:04:08.341551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:55.563 #20 NEW cov: 11714 ft: 13658 corp: 13/447b lim: 40 exec/s: 20 rss: 69Mb L: 40/40 MS: 1 ChangeByte- 00:09:55.563 [2024-06-11 12:04:08.401543] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.563 [2024-06-11 12:04:08.401578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:55.563 [2024-06-11 12:04:08.401679] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.563 [2024-06-11 12:04:08.401699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:55.563 [2024-06-11 12:04:08.401796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff56ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.563 [2024-06-11 12:04:08.401816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:55.563 [2024-06-11 12:04:08.401910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffff09 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.563 [2024-06-11 12:04:08.401931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:55.563 [2024-06-11 12:04:08.402031] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:0080ffff cdw11:ffffffa4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.563 [2024-06-11 12:04:08.402052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:55.563 #21 NEW cov: 11714 ft: 13675 corp: 14/487b lim: 40 exec/s: 21 rss: 69Mb L: 40/40 MS: 1 
ChangeBinInt- 00:09:55.563 [2024-06-11 12:04:08.460874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.563 [2024-06-11 12:04:08.460914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:55.563 [2024-06-11 12:04:08.461008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.563 [2024-06-11 12:04:08.461030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:55.563 #22 NEW cov: 11714 ft: 13689 corp: 15/509b lim: 40 exec/s: 22 rss: 69Mb L: 22/40 MS: 1 ChangeByte- 00:09:55.563 [2024-06-11 12:04:08.531194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff56ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.563 [2024-06-11 12:04:08.531228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:55.563 [2024-06-11 12:04:08.531320] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffff3a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.563 [2024-06-11 12:04:08.531343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:55.563 #23 NEW cov: 11714 ft: 13694 corp: 16/527b lim: 40 exec/s: 23 rss: 69Mb L: 18/40 MS: 1 EraseBytes- 00:09:55.822 [2024-06-11 12:04:08.601567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.822 [2024-06-11 12:04:08.601603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:55.822 [2024-06-11 12:04:08.601695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.822 [2024-06-11 12:04:08.601717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:55.822 #24 NEW cov: 11714 ft: 13761 corp: 17/549b lim: 40 exec/s: 24 rss: 69Mb L: 22/40 MS: 1 EraseBytes- 00:09:55.822 [2024-06-11 12:04:08.662034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffff04 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.822 [2024-06-11 12:04:08.662069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:55.822 [2024-06-11 12:04:08.662169] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffff3a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.822 [2024-06-11 12:04:08.662191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:55.822 [2024-06-11 12:04:08.662284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffa4 SGL TRANSPORT 
DATA BLOCK TRANSPORT 0x0 00:09:55.822 [2024-06-11 12:04:08.662306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:55.822 #25 NEW cov: 11714 ft: 13827 corp: 18/573b lim: 40 exec/s: 25 rss: 69Mb L: 24/40 MS: 1 CMP- DE: "\377\004"- 00:09:55.822 [2024-06-11 12:04:08.723018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.822 [2024-06-11 12:04:08.723052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:55.822 [2024-06-11 12:04:08.723153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffff7e cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.822 [2024-06-11 12:04:08.723172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:55.823 [2024-06-11 12:04:08.723270] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff56ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.823 [2024-06-11 12:04:08.723295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:55.823 [2024-06-11 12:04:08.723405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.823 [2024-06-11 12:04:08.723427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:55.823 [2024-06-11 12:04:08.723522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffffffa4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.823 [2024-06-11 12:04:08.723543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:55.823 #26 NEW cov: 11714 ft: 13874 corp: 19/613b lim: 40 exec/s: 26 rss: 69Mb L: 40/40 MS: 1 ChangeByte- 00:09:55.823 [2024-06-11 12:04:08.783145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.823 [2024-06-11 12:04:08.783181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:55.823 [2024-06-11 12:04:08.783287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.823 [2024-06-11 12:04:08.783308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:55.823 [2024-06-11 12:04:08.783409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffff56ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.823 [2024-06-11 12:04:08.783430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:55.823 [2024-06-11 12:04:08.783529] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE 
RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ff7fffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.823 [2024-06-11 12:04:08.783550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:55.823 [2024-06-11 12:04:08.783645] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:ff7fffff cdw11:ffffffa4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.823 [2024-06-11 12:04:08.783666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:55.823 #27 NEW cov: 11714 ft: 13898 corp: 20/653b lim: 40 exec/s: 27 rss: 69Mb L: 40/40 MS: 1 CopyPart- 00:09:55.823 [2024-06-11 12:04:08.852938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff56ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.823 [2024-06-11 12:04:08.852971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:55.823 [2024-06-11 12:04:08.853071] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ff010000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.823 [2024-06-11 12:04:08.853093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:55.823 [2024-06-11 12:04:08.853182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00ffffff cdw11:ffffffa4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:55.823 [2024-06-11 12:04:08.853203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:56.082 #28 NEW cov: 11714 ft: 13908 corp: 21/677b lim: 40 exec/s: 28 rss: 69Mb L: 24/40 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\000"- 00:09:56.082 [2024-06-11 12:04:08.912632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.082 [2024-06-11 12:04:08.912667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:56.082 #29 NEW cov: 11714 ft: 14249 corp: 22/692b lim: 40 exec/s: 29 rss: 69Mb L: 15/40 MS: 1 EraseBytes- 00:09:56.082 [2024-06-11 12:04:08.974158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.082 [2024-06-11 12:04:08.974192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:56.082 [2024-06-11 12:04:08.974260] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffff3aff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.082 [2024-06-11 12:04:08.974280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:56.082 [2024-06-11 12:04:08.974385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff56ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.082 [2024-06-11 12:04:08.974406] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:56.082 [2024-06-11 12:04:08.974495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffff3a cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.082 [2024-06-11 12:04:08.974516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:56.082 [2024-06-11 12:04:08.974613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffffffa4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.082 [2024-06-11 12:04:08.974633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:56.082 #30 NEW cov: 11714 ft: 14266 corp: 23/732b lim: 40 exec/s: 30 rss: 70Mb L: 40/40 MS: 1 CopyPart- 00:09:56.082 [2024-06-11 12:04:09.043613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.082 [2024-06-11 12:04:09.043647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:56.082 [2024-06-11 12:04:09.043742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00200000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.082 [2024-06-11 12:04:09.043764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:56.082 #31 NEW cov: 11714 ft: 14273 corp: 24/754b lim: 40 exec/s: 31 rss: 70Mb L: 22/40 MS: 1 ChangeBit- 00:09:56.082 [2024-06-11 12:04:09.104196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:ff040000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.082 [2024-06-11 12:04:09.104230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:56.082 [2024-06-11 12:04:09.104332] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000020 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.082 [2024-06-11 12:04:09.104354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:56.082 [2024-06-11 12:04:09.104458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:ffdffdff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.082 [2024-06-11 12:04:09.104482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:56.342 #32 NEW cov: 11714 ft: 14284 corp: 25/778b lim: 40 exec/s: 32 rss: 70Mb L: 24/40 MS: 1 PersAutoDict- DE: "\377\004"- 00:09:56.343 [2024-06-11 12:04:09.175218] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:60ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.343 [2024-06-11 12:04:09.175252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:56.343 [2024-06-11 12:04:09.175347] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.343 [2024-06-11 12:04:09.175374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:56.343 [2024-06-11 12:04:09.175468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.343 [2024-06-11 12:04:09.175489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:56.343 [2024-06-11 12:04:09.175587] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.343 [2024-06-11 12:04:09.175608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:56.343 [2024-06-11 12:04:09.175704] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffffffa4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:56.343 [2024-06-11 12:04:09.175725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:56.343 #33 NEW cov: 11714 ft: 14291 corp: 26/818b lim: 40 exec/s: 16 rss: 70Mb L: 40/40 MS: 1 ChangeByte- 00:09:56.343 #33 DONE cov: 11714 ft: 14291 corp: 26/818b lim: 40 exec/s: 16 rss: 70Mb 00:09:56.343 ###### Recommended dictionary. ###### 00:09:56.343 "\377\004" # Uses: 1 00:09:56.343 "\001\000\000\000\000\000\000\000" # Uses: 0 00:09:56.343 ###### End of recommended dictionary. 
###### 00:09:56.343 Done 33 runs in 2 second(s) 00:09:56.343 12:04:09 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_13.conf 00:09:56.343 12:04:09 -- ../common.sh@72 -- # (( i++ )) 00:09:56.343 12:04:09 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:56.343 12:04:09 -- ../common.sh@73 -- # start_llvm_fuzz 14 1 0x1 00:09:56.343 12:04:09 -- nvmf/run.sh@23 -- # local fuzzer_type=14 00:09:56.343 12:04:09 -- nvmf/run.sh@24 -- # local timen=1 00:09:56.343 12:04:09 -- nvmf/run.sh@25 -- # local core=0x1 00:09:56.343 12:04:09 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:09:56.343 12:04:09 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_14.conf 00:09:56.343 12:04:09 -- nvmf/run.sh@29 -- # printf %02d 14 00:09:56.343 12:04:09 -- nvmf/run.sh@29 -- # port=4414 00:09:56.343 12:04:09 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:09:56.343 12:04:09 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' 00:09:56.343 12:04:09 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4414"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:56.343 12:04:09 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' -c /tmp/fuzz_json_14.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 -Z 14 -r /var/tmp/spdk14.sock 00:09:56.343 [2024-06-11 12:04:09.374221] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:09:56.343 [2024-06-11 12:04:09.374290] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2695943 ] 00:09:56.602 EAL: No free 2048 kB hugepages reported on node 1 00:09:56.602 [2024-06-11 12:04:09.630063] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:56.861 [2024-06-11 12:04:09.656951] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:56.861 [2024-06-11 12:04:09.657123] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:56.861 [2024-06-11 12:04:09.711980] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:56.861 [2024-06-11 12:04:09.728219] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4414 *** 00:09:56.861 INFO: Running with entropic power schedule (0xFF, 100). 00:09:56.861 INFO: Seed: 43276538 00:09:56.861 INFO: Loaded 1 modules (341565 inline 8-bit counters): 341565 [0x26b198c, 0x2704fc9), 00:09:56.861 INFO: Loaded 1 PC tables (341565 PCs): 341565 [0x2704fd0,0x2c3b3a0), 00:09:56.861 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:09:56.861 INFO: A corpus is not provided, starting from an empty corpus 00:09:56.861 #2 INITED exec/s: 0 rss: 61Mb 00:09:56.861 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:09:56.861 This may also happen if the target rejected all inputs we tried so far 00:09:56.861 [2024-06-11 12:04:09.797590] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:56.861 [2024-06-11 12:04:09.797634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:56.861 [2024-06-11 12:04:09.797733] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:56.861 [2024-06-11 12:04:09.797751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:56.861 [2024-06-11 12:04:09.797851] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:56.861 [2024-06-11 12:04:09.797869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:56.861 [2024-06-11 12:04:09.797964] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:56.861 [2024-06-11 12:04:09.797983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:57.429 NEW_FUNC[1/666]: 0x4b2740 in fuzz_admin_set_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:392 00:09:57.429 NEW_FUNC[2/666]: 0x4d3ae0 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:09:57.429 #6 NEW cov: 11514 ft: 11515 corp: 2/36b lim: 35 exec/s: 0 rss: 68Mb L: 35/35 MS: 4 CopyPart-ShuffleBytes-CrossOver-InsertRepeatedBytes- 00:09:57.429 [2024-06-11 12:04:10.268513] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:57.429 [2024-06-11 12:04:10.268565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:57.429 [2024-06-11 12:04:10.268664] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:57.429 [2024-06-11 12:04:10.268687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:57.429 [2024-06-11 12:04:10.268786] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:57.429 [2024-06-11 12:04:10.268807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:57.429 [2024-06-11 12:04:10.268907] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:57.429 [2024-06-11 12:04:10.268935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:57.429 #7 NEW cov: 11634 ft: 12115 corp: 3/71b lim: 35 exec/s: 0 rss: 69Mb L: 35/35 MS: 1 ChangeBinInt- 00:09:57.429 [2024-06-11 12:04:10.328530] nvme_qpair.c: 215:nvme_admin_qpair_print_command: 
*NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:57.429 [2024-06-11 12:04:10.328564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:57.429 [2024-06-11 12:04:10.328658] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:57.429 [2024-06-11 12:04:10.328676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:57.429 [2024-06-11 12:04:10.328778] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:57.429 [2024-06-11 12:04:10.328796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:57.429 [2024-06-11 12:04:10.328896] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:57.430 [2024-06-11 12:04:10.328912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:57.430 [2024-06-11 12:04:10.329005] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:57.430 [2024-06-11 12:04:10.329023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:57.430 #8 NEW cov: 11640 ft: 12358 corp: 4/106b lim: 35 exec/s: 0 rss: 69Mb L: 35/35 MS: 1 ShuffleBytes- 00:09:57.430 [2024-06-11 12:04:10.388862] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:57.430 [2024-06-11 12:04:10.388890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:57.430 [2024-06-11 12:04:10.388985] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:57.430 [2024-06-11 12:04:10.389003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:57.430 [2024-06-11 12:04:10.389093] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:57.430 [2024-06-11 12:04:10.389108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:57.430 [2024-06-11 12:04:10.389203] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:57.430 [2024-06-11 12:04:10.389219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:57.430 #9 NEW cov: 11725 ft: 12685 corp: 5/141b lim: 35 exec/s: 0 rss: 69Mb L: 35/35 MS: 1 CopyPart- 00:09:57.430 [2024-06-11 12:04:10.437788] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000c2 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:57.430 [2024-06-11 12:04:10.437817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) 
qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:57.430 [2024-06-11 12:04:10.437908] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000c2 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:57.430 [2024-06-11 12:04:10.437926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:57.430 #11 NEW cov: 11725 ft: 13153 corp: 6/155b lim: 35 exec/s: 0 rss: 69Mb L: 14/35 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:09:57.689 [2024-06-11 12:04:10.489232] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:57.689 [2024-06-11 12:04:10.489261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:57.689 [2024-06-11 12:04:10.489354] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:57.689 [2024-06-11 12:04:10.489377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:57.689 [2024-06-11 12:04:10.489479] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:57.689 [2024-06-11 12:04:10.489496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:57.689 [2024-06-11 12:04:10.489607] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:57.689 [2024-06-11 12:04:10.489625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:57.689 #12 NEW cov: 11725 ft: 13211 corp: 7/190b lim: 35 exec/s: 0 rss: 69Mb L: 35/35 MS: 1 ShuffleBytes- 00:09:57.689 [2024-06-11 12:04:10.538249] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:57.689 [2024-06-11 12:04:10.538278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:57.689 #13 NEW cov: 11725 ft: 13442 corp: 8/208b lim: 35 exec/s: 0 rss: 69Mb L: 18/35 MS: 1 EraseBytes- 00:09:57.689 [2024-06-11 12:04:10.588453] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000c2 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:57.689 [2024-06-11 12:04:10.588483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:57.689 [2024-06-11 12:04:10.588579] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000c2 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:57.689 [2024-06-11 12:04:10.588597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:57.689 #14 NEW cov: 11725 ft: 13523 corp: 9/222b lim: 35 exec/s: 0 rss: 69Mb L: 14/35 MS: 1 ChangeByte- 00:09:57.689 [2024-06-11 12:04:10.650152] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:57.689 [2024-06-11 12:04:10.650179] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:57.689 [2024-06-11 12:04:10.650271] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:57.689 [2024-06-11 12:04:10.650288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:57.689 [2024-06-11 12:04:10.650387] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:57.689 [2024-06-11 12:04:10.650403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:57.689 [2024-06-11 12:04:10.650501] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:800000f9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:57.689 [2024-06-11 12:04:10.650520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:57.689 NEW_FUNC[1/1]: 0x19807e0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:57.689 #15 NEW cov: 11748 ft: 13584 corp: 10/257b lim: 35 exec/s: 0 rss: 69Mb L: 35/35 MS: 1 ChangeBinInt- 00:09:57.689 [2024-06-11 12:04:10.709147] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000c2 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:57.689 [2024-06-11 12:04:10.709174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:57.689 [2024-06-11 12:04:10.709270] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000c2 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:57.689 [2024-06-11 12:04:10.709290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:57.948 #16 NEW cov: 11748 ft: 13631 corp: 11/271b lim: 35 exec/s: 0 rss: 69Mb L: 14/35 MS: 1 ChangeByte- 00:09:57.948 [2024-06-11 12:04:10.770242] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:57.948 [2024-06-11 12:04:10.770268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:57.948 [2024-06-11 12:04:10.770363] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:57.948 [2024-06-11 12:04:10.770380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:57.948 [2024-06-11 12:04:10.770473] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:57.948 [2024-06-11 12:04:10.770489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:57.948 [2024-06-11 12:04:10.770581] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:57.948 [2024-06-11 12:04:10.770599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 
cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:57.948 #17 NEW cov: 11748 ft: 13668 corp: 12/303b lim: 35 exec/s: 17 rss: 69Mb L: 32/35 MS: 1 EraseBytes- 00:09:57.948 [2024-06-11 12:04:10.829616] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000c2 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:57.948 [2024-06-11 12:04:10.829643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:57.948 [2024-06-11 12:04:10.829747] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000c2 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:57.948 [2024-06-11 12:04:10.829767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:57.948 #18 NEW cov: 11748 ft: 13695 corp: 13/323b lim: 35 exec/s: 18 rss: 69Mb L: 20/35 MS: 1 InsertRepeatedBytes- 00:09:57.948 [2024-06-11 12:04:10.890333] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000026 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:57.948 [2024-06-11 12:04:10.890364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:57.948 [2024-06-11 12:04:10.890460] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:57.948 [2024-06-11 12:04:10.890477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:57.948 [2024-06-11 12:04:10.890569] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:57.948 [2024-06-11 12:04:10.890586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:57.948 #21 NEW cov: 11748 ft: 13861 corp: 14/349b lim: 35 exec/s: 21 rss: 69Mb L: 26/35 MS: 3 ChangeByte-ChangeByte-InsertRepeatedBytes- 00:09:57.948 [2024-06-11 12:04:10.940577] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000c2 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:57.948 [2024-06-11 12:04:10.940609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:57.948 [2024-06-11 12:04:10.940698] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000c2 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:57.949 [2024-06-11 12:04:10.940718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:57.949 [2024-06-11 12:04:10.940816] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:57.949 [2024-06-11 12:04:10.940834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:57.949 #22 NEW cov: 11748 ft: 13930 corp: 15/371b lim: 35 exec/s: 22 rss: 69Mb L: 22/35 MS: 1 CMP- DE: "\377\005"- 00:09:58.207 [2024-06-11 12:04:11.000910] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.208 [2024-06-11 12:04:11.000940] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:58.208 [2024-06-11 12:04:11.001029] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.208 [2024-06-11 12:04:11.001048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:58.208 #23 NEW cov: 11748 ft: 13942 corp: 16/396b lim: 35 exec/s: 23 rss: 69Mb L: 25/35 MS: 1 CrossOver- 00:09:58.208 [2024-06-11 12:04:11.050331] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000c2 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.208 [2024-06-11 12:04:11.050368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:58.208 #24 NEW cov: 11748 ft: 14513 corp: 17/408b lim: 35 exec/s: 24 rss: 69Mb L: 12/35 MS: 1 EraseBytes- 00:09:58.208 [2024-06-11 12:04:11.101253] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000c2 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.208 [2024-06-11 12:04:11.101283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:58.208 [2024-06-11 12:04:11.101381] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000c2 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.208 [2024-06-11 12:04:11.101400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:58.208 #25 NEW cov: 11748 ft: 14526 corp: 18/422b lim: 35 exec/s: 25 rss: 69Mb L: 14/35 MS: 1 ChangeByte- 00:09:58.208 [2024-06-11 12:04:11.152512] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.208 [2024-06-11 12:04:11.152540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:58.208 [2024-06-11 12:04:11.152632] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.208 [2024-06-11 12:04:11.152649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:58.208 [2024-06-11 12:04:11.152739] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.208 [2024-06-11 12:04:11.152757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:58.208 [2024-06-11 12:04:11.152851] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.208 [2024-06-11 12:04:11.152871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:58.208 [2024-06-11 12:04:11.152962] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.208 [2024-06-11 12:04:11.152980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 
cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:58.208 #26 NEW cov: 11748 ft: 14629 corp: 19/457b lim: 35 exec/s: 26 rss: 69Mb L: 35/35 MS: 1 ChangeBinInt- 00:09:58.208 [2024-06-11 12:04:11.201975] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000c2 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.208 [2024-06-11 12:04:11.202003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:58.208 [2024-06-11 12:04:11.202100] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:000000c2 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.208 [2024-06-11 12:04:11.202117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:58.208 [2024-06-11 12:04:11.202203] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.208 [2024-06-11 12:04:11.202220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:58.208 #27 NEW cov: 11748 ft: 14646 corp: 20/479b lim: 35 exec/s: 27 rss: 70Mb L: 22/35 MS: 1 ShuffleBytes- 00:09:58.467 [2024-06-11 12:04:11.261491] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000c2 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.467 [2024-06-11 12:04:11.261522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:58.467 #28 NEW cov: 11748 ft: 14665 corp: 21/486b lim: 35 exec/s: 28 rss: 70Mb L: 7/35 MS: 1 EraseBytes- 00:09:58.467 [2024-06-11 12:04:11.323227] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.467 [2024-06-11 12:04:11.323254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:58.467 [2024-06-11 12:04:11.323350] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.467 [2024-06-11 12:04:11.323371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:58.467 [2024-06-11 12:04:11.323474] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.467 [2024-06-11 12:04:11.323489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:58.467 [2024-06-11 12:04:11.323581] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.467 [2024-06-11 12:04:11.323597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:58.467 #29 NEW cov: 11748 ft: 14704 corp: 22/521b lim: 35 exec/s: 29 rss: 70Mb L: 35/35 MS: 1 ShuffleBytes- 00:09:58.467 [2024-06-11 12:04:11.383665] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.467 [2024-06-11 12:04:11.383692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD 
(00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:58.467 [2024-06-11 12:04:11.383780] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.467 [2024-06-11 12:04:11.383796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:58.467 [2024-06-11 12:04:11.383885] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.467 [2024-06-11 12:04:11.383900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:58.467 [2024-06-11 12:04:11.383988] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.467 [2024-06-11 12:04:11.384006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:58.467 #30 NEW cov: 11748 ft: 14784 corp: 23/556b lim: 35 exec/s: 30 rss: 70Mb L: 35/35 MS: 1 ChangeByte- 00:09:58.467 [2024-06-11 12:04:11.433508] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.467 [2024-06-11 12:04:11.433534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:58.467 [2024-06-11 12:04:11.433625] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.467 [2024-06-11 12:04:11.433640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:58.467 [2024-06-11 12:04:11.433728] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.467 [2024-06-11 12:04:11.433744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:58.467 #31 NEW cov: 11748 ft: 14876 corp: 24/586b lim: 35 exec/s: 31 rss: 70Mb L: 30/35 MS: 1 InsertRepeatedBytes- 00:09:58.467 [2024-06-11 12:04:11.494026] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.467 [2024-06-11 12:04:11.494053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:58.467 [2024-06-11 12:04:11.494143] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.467 [2024-06-11 12:04:11.494158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:58.467 [2024-06-11 12:04:11.494246] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.467 [2024-06-11 12:04:11.494263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:58.467 [2024-06-11 12:04:11.494354] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:80000000 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:09:58.467 [2024-06-11 12:04:11.494378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:58.727 #32 NEW cov: 11748 ft: 14891 corp: 25/621b lim: 35 exec/s: 32 rss: 70Mb L: 35/35 MS: 1 CrossOver- 00:09:58.727 [2024-06-11 12:04:11.553471] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.727 [2024-06-11 12:04:11.553509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:58.727 [2024-06-11 12:04:11.553595] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.727 [2024-06-11 12:04:11.553613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:58.727 [2024-06-11 12:04:11.553711] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:80000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.727 [2024-06-11 12:04:11.553729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:58.727 #33 NEW cov: 11748 ft: 14900 corp: 26/644b lim: 35 exec/s: 33 rss: 70Mb L: 23/35 MS: 1 EraseBytes- 00:09:58.727 [2024-06-11 12:04:11.604439] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.727 [2024-06-11 12:04:11.604467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:58.727 [2024-06-11 12:04:11.604558] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.727 [2024-06-11 12:04:11.604575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:58.727 [2024-06-11 12:04:11.604660] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.727 [2024-06-11 12:04:11.604675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:58.727 [2024-06-11 12:04:11.604771] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.727 [2024-06-11 12:04:11.604789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:58.727 #34 NEW cov: 11748 ft: 14931 corp: 27/679b lim: 35 exec/s: 34 rss: 70Mb L: 35/35 MS: 1 ChangeByte- 00:09:58.727 [2024-06-11 12:04:11.654591] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.727 [2024-06-11 12:04:11.654618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:58.727 [2024-06-11 12:04:11.654713] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.727 [2024-06-11 12:04:11.654730] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:58.727 [2024-06-11 12:04:11.654817] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:0000002a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.727 [2024-06-11 12:04:11.654833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:58.727 [2024-06-11 12:04:11.654934] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.727 [2024-06-11 12:04:11.654955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:58.727 #35 NEW cov: 11748 ft: 14938 corp: 28/714b lim: 35 exec/s: 35 rss: 70Mb L: 35/35 MS: 1 ChangeByte- 00:09:58.727 [2024-06-11 12:04:11.703239] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000c2 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.727 [2024-06-11 12:04:11.703269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:58.727 #36 NEW cov: 11748 ft: 14945 corp: 29/726b lim: 35 exec/s: 36 rss: 70Mb L: 12/35 MS: 1 EraseBytes- 00:09:58.727 [2024-06-11 12:04:11.755010] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.727 [2024-06-11 12:04:11.755037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:58.727 [2024-06-11 12:04:11.755125] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.727 [2024-06-11 12:04:11.755141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:58.727 [2024-06-11 12:04:11.755234] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.727 [2024-06-11 12:04:11.755253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:58.727 [2024-06-11 12:04:11.755349] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:58.727 [2024-06-11 12:04:11.755371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:58.987 #37 NEW cov: 11748 ft: 14962 corp: 30/761b lim: 35 exec/s: 18 rss: 70Mb L: 35/35 MS: 1 PersAutoDict- DE: "\377\005"- 00:09:58.987 #37 DONE cov: 11748 ft: 14962 corp: 30/761b lim: 35 exec/s: 18 rss: 70Mb 00:09:58.987 ###### Recommended dictionary. ###### 00:09:58.987 "\377\005" # Uses: 1 00:09:58.987 ###### End of recommended dictionary. 
###### 00:09:58.987 Done 37 runs in 2 second(s) 00:09:58.987 12:04:11 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_14.conf 00:09:58.987 12:04:11 -- ../common.sh@72 -- # (( i++ )) 00:09:58.987 12:04:11 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:58.987 12:04:11 -- ../common.sh@73 -- # start_llvm_fuzz 15 1 0x1 00:09:58.987 12:04:11 -- nvmf/run.sh@23 -- # local fuzzer_type=15 00:09:58.987 12:04:11 -- nvmf/run.sh@24 -- # local timen=1 00:09:58.987 12:04:11 -- nvmf/run.sh@25 -- # local core=0x1 00:09:58.987 12:04:11 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:09:58.987 12:04:11 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_15.conf 00:09:58.987 12:04:11 -- nvmf/run.sh@29 -- # printf %02d 15 00:09:58.987 12:04:11 -- nvmf/run.sh@29 -- # port=4415 00:09:58.987 12:04:11 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:09:58.987 12:04:11 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' 00:09:58.987 12:04:11 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4415"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:58.987 12:04:11 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' -c /tmp/fuzz_json_15.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 -Z 15 -r /var/tmp/spdk15.sock 00:09:58.987 [2024-06-11 12:04:11.941623] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:09:58.987 [2024-06-11 12:04:11.941701] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2696309 ] 00:09:58.987 EAL: No free 2048 kB hugepages reported on node 1 00:09:59.251 [2024-06-11 12:04:12.181338] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:59.251 [2024-06-11 12:04:12.207393] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:09:59.251 [2024-06-11 12:04:12.207564] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:59.251 [2024-06-11 12:04:12.262030] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:59.251 [2024-06-11 12:04:12.278278] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4415 *** 00:09:59.512 INFO: Running with entropic power schedule (0xFF, 100). 00:09:59.512 INFO: Seed: 2593278195 00:09:59.512 INFO: Loaded 1 modules (341565 inline 8-bit counters): 341565 [0x26b198c, 0x2704fc9), 00:09:59.512 INFO: Loaded 1 PC tables (341565 PCs): 341565 [0x2704fd0,0x2c3b3a0), 00:09:59.512 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:09:59.512 INFO: A corpus is not provided, starting from an empty corpus 00:09:59.512 #2 INITED exec/s: 0 rss: 61Mb 00:09:59.512 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
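The xtrace lines above show the pattern the nvmf fuzz runner uses to provision each fuzzer instance: the instance index is zero-padded with printf %02d and appended to "44" to form the TCP listener port (4415 for instance 15), sed rewrites the default trsvcid "4420" in fuzz_json.conf, a per-instance corpus directory is created, and llvm_nvme_fuzz is launched against the resulting transport ID with a one-second time budget (-t 1). A minimal bash sketch of that pattern follows; the variable names SPDK_DIR, CORPUS_DIR and OUTPUT_DIR and the redirection of sed into the generated config are assumptions, since bash xtrace does not print redirections:

    i=15                                   # fuzzer instance index (assumed variable name)
    port="44$(printf '%02d' "$i")"         # yields 4415, as in nvmf/run.sh@29 above
    nvmf_cfg="/tmp/fuzz_json_${i}.conf"
    trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:${port}"
    # rewrite the listener port in the JSON template; the output redirection is assumed
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"${port}\"/" \
        "$SPDK_DIR/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
    mkdir -p "$CORPUS_DIR/llvm_nvmf_${i}"  # per-instance corpus directory
    "$SPDK_DIR/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m 0x1 -s 512 \
        -P "$OUTPUT_DIR/llvm/" -F "$trid" -c "$nvmf_cfg" -t 1 \
        -D "$CORPUS_DIR/llvm_nvmf_${i}" -Z "$i" -r "/var/tmp/spdk${i}.sock"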
00:09:59.512 This may also happen if the target rejected all inputs we tried so far 00:09:59.512 [2024-06-11 12:04:12.333970] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000006c7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:59.512 [2024-06-11 12:04:12.334015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:59.512 [2024-06-11 12:04:12.334088] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000006c7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:59.512 [2024-06-11 12:04:12.334108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:59.770 NEW_FUNC[1/663]: 0x4b3c80 in fuzz_admin_get_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:460 00:09:59.770 NEW_FUNC[2/663]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:59.770 #5 NEW cov: 11469 ft: 11470 corp: 2/15b lim: 35 exec/s: 0 rss: 68Mb L: 14/14 MS: 3 ChangeBit-ShuffleBytes-InsertRepeatedBytes- 00:10:00.030 [2024-06-11 12:04:12.805336] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.030 [2024-06-11 12:04:12.805386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:10:00.030 [2024-06-11 12:04:12.805460] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.030 [2024-06-11 12:04:12.805480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:00.030 [2024-06-11 12:04:12.805549] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.030 [2024-06-11 12:04:12.805567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:10:00.030 #8 NEW cov: 11582 ft: 12240 corp: 3/38b lim: 35 exec/s: 0 rss: 68Mb L: 23/23 MS: 3 ChangeByte-InsertByte-InsertRepeatedBytes- 00:10:00.030 [2024-06-11 12:04:12.855314] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.030 [2024-06-11 12:04:12.855349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:10:00.030 [2024-06-11 12:04:12.855427] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.030 [2024-06-11 12:04:12.855447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:00.030 [2024-06-11 12:04:12.855515] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.030 [2024-06-11 12:04:12.855534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:10:00.030 #9 NEW cov: 11588 ft: 12355 corp: 4/61b lim: 35 exec/s: 0 rss: 68Mb 
L: 23/23 MS: 1 ChangeBit- 00:10:00.030 [2024-06-11 12:04:12.915507] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007fb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.030 [2024-06-11 12:04:12.915541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:10:00.030 [2024-06-11 12:04:12.915611] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.030 [2024-06-11 12:04:12.915631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:00.030 [2024-06-11 12:04:12.915699] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.030 [2024-06-11 12:04:12.915717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:10:00.030 #10 NEW cov: 11673 ft: 12583 corp: 5/84b lim: 35 exec/s: 0 rss: 69Mb L: 23/23 MS: 1 ChangeBit- 00:10:00.030 [2024-06-11 12:04:12.975564] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000003c7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.030 [2024-06-11 12:04:12.975598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:10:00.030 [2024-06-11 12:04:12.975672] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000006c7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.030 [2024-06-11 12:04:12.975692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:00.030 #11 NEW cov: 11673 ft: 12775 corp: 6/99b lim: 35 exec/s: 0 rss: 69Mb L: 15/23 MS: 1 InsertByte- 00:10:00.030 [2024-06-11 12:04:13.035882] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.030 [2024-06-11 12:04:13.035916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:10:00.030 [2024-06-11 12:04:13.035989] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.030 [2024-06-11 12:04:13.036009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:00.030 [2024-06-11 12:04:13.036078] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.030 [2024-06-11 12:04:13.036097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:10:00.290 #12 NEW cov: 11673 ft: 12857 corp: 7/122b lim: 35 exec/s: 0 rss: 69Mb L: 23/23 MS: 1 ShuffleBytes- 00:10:00.290 [2024-06-11 12:04:13.086003] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.290 [2024-06-11 12:04:13.086038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:10:00.290 [2024-06-11 12:04:13.086109] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.290 [2024-06-11 12:04:13.086129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:00.290 [2024-06-11 12:04:13.086199] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.290 [2024-06-11 12:04:13.086219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:10:00.290 #13 NEW cov: 11673 ft: 12990 corp: 8/145b lim: 35 exec/s: 0 rss: 69Mb L: 23/23 MS: 1 CMP- DE: "\001\000"- 00:10:00.290 [2024-06-11 12:04:13.136455] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.290 [2024-06-11 12:04:13.136488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:10:00.290 [2024-06-11 12:04:13.136548] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.290 [2024-06-11 12:04:13.136569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:00.290 [2024-06-11 12:04:13.136641] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.290 [2024-06-11 12:04:13.136659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:10:00.290 [2024-06-11 12:04:13.136731] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.290 [2024-06-11 12:04:13.136751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:10:00.290 [2024-06-11 12:04:13.136824] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.290 [2024-06-11 12:04:13.136843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:10:00.290 #16 NEW cov: 11673 ft: 13482 corp: 9/180b lim: 35 exec/s: 0 rss: 69Mb L: 35/35 MS: 3 InsertByte-InsertByte-InsertRepeatedBytes- 00:10:00.290 [2024-06-11 12:04:13.186211] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007fb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.290 [2024-06-11 12:04:13.186244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:10:00.290 [2024-06-11 12:04:13.186319] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.290 [2024-06-11 12:04:13.186339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:00.290 [2024-06-11 12:04:13.186411] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.290 [2024-06-11 12:04:13.186431] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:10:00.290 NEW_FUNC[1/1]: 0x19807e0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:10:00.290 #22 NEW cov: 11696 ft: 13558 corp: 10/203b lim: 35 exec/s: 0 rss: 69Mb L: 23/35 MS: 1 ChangeBinInt- 00:10:00.290 [2024-06-11 12:04:13.246287] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000003c7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.290 [2024-06-11 12:04:13.246320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:10:00.290 [2024-06-11 12:04:13.246393] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000006c7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.290 [2024-06-11 12:04:13.246413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:00.290 #23 NEW cov: 11696 ft: 13609 corp: 11/218b lim: 35 exec/s: 0 rss: 69Mb L: 15/35 MS: 1 ShuffleBytes- 00:10:00.290 [2024-06-11 12:04:13.306753] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.290 [2024-06-11 12:04:13.306786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:10:00.290 [2024-06-11 12:04:13.306860] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.290 [2024-06-11 12:04:13.306879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:00.290 [2024-06-11 12:04:13.306950] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000122 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.290 [2024-06-11 12:04:13.306968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:10:00.290 [2024-06-11 12:04:13.307035] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.290 [2024-06-11 12:04:13.307054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:10:00.549 #24 NEW cov: 11696 ft: 13728 corp: 12/248b lim: 35 exec/s: 24 rss: 69Mb L: 30/35 MS: 1 InsertRepeatedBytes- 00:10:00.549 [2024-06-11 12:04:13.366670] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000003c7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.549 [2024-06-11 12:04:13.366704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:10:00.549 [2024-06-11 12:04:13.366783] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000000c7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.549 [2024-06-11 12:04:13.366802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:00.549 #25 NEW cov: 11696 ft: 13740 corp: 13/263b lim: 35 exec/s: 25 rss: 69Mb L: 15/35 MS: 1 ChangeByte- 00:10:00.549 [2024-06-11 12:04:13.427269] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.549 [2024-06-11 12:04:13.427303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:10:00.549 [2024-06-11 12:04:13.427380] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.549 [2024-06-11 12:04:13.427399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:00.549 [2024-06-11 12:04:13.427470] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.549 [2024-06-11 12:04:13.427489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:10:00.549 [2024-06-11 12:04:13.427559] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.549 [2024-06-11 12:04:13.427578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:10:00.549 [2024-06-11 12:04:13.427648] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.549 [2024-06-11 12:04:13.427668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:10:00.549 #26 NEW cov: 11696 ft: 13776 corp: 14/298b lim: 35 exec/s: 26 rss: 69Mb L: 35/35 MS: 1 ShuffleBytes- 00:10:00.549 [2024-06-11 12:04:13.487104] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.550 [2024-06-11 12:04:13.487138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:10:00.550 [2024-06-11 12:04:13.487213] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.550 [2024-06-11 12:04:13.487232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:00.550 [2024-06-11 12:04:13.487301] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.550 [2024-06-11 12:04:13.487320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:10:00.550 #27 NEW cov: 11696 ft: 13811 corp: 15/321b lim: 35 exec/s: 27 rss: 69Mb L: 23/35 MS: 1 PersAutoDict- DE: "\001\000"- 00:10:00.550 [2024-06-11 12:04:13.537522] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.550 [2024-06-11 12:04:13.537556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:10:00.550 [2024-06-11 12:04:13.537627] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.550 [2024-06-11 12:04:13.537647] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:00.550 [2024-06-11 12:04:13.537716] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.550 [2024-06-11 12:04:13.537739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:10:00.550 [2024-06-11 12:04:13.537807] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.550 [2024-06-11 12:04:13.537826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:10:00.550 [2024-06-11 12:04:13.537897] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.550 [2024-06-11 12:04:13.537916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:10:00.550 #28 NEW cov: 11696 ft: 13830 corp: 16/356b lim: 35 exec/s: 28 rss: 69Mb L: 35/35 MS: 1 CopyPart- 00:10:00.809 [2024-06-11 12:04:13.587697] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.809 [2024-06-11 12:04:13.587731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:10:00.809 [2024-06-11 12:04:13.587802] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.809 [2024-06-11 12:04:13.587822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:00.809 [2024-06-11 12:04:13.587889] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.810 [2024-06-11 12:04:13.587908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:10:00.810 [2024-06-11 12:04:13.587974] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.810 [2024-06-11 12:04:13.587993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:10:00.810 [2024-06-11 12:04:13.588065] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.810 [2024-06-11 12:04:13.588083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:10:00.810 #29 NEW cov: 11696 ft: 13853 corp: 17/391b lim: 35 exec/s: 29 rss: 70Mb L: 35/35 MS: 1 CopyPart- 00:10:00.810 [2024-06-11 12:04:13.647851] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.810 [2024-06-11 12:04:13.647885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:10:00.810 [2024-06-11 12:04:13.647959] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: 
GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.810 [2024-06-11 12:04:13.647979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:00.810 [2024-06-11 12:04:13.648050] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.810 [2024-06-11 12:04:13.648069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:10:00.810 [2024-06-11 12:04:13.648142] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.810 [2024-06-11 12:04:13.648161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:10:00.810 [2024-06-11 12:04:13.648229] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.810 [2024-06-11 12:04:13.648252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:10:00.810 #30 NEW cov: 11696 ft: 13869 corp: 18/426b lim: 35 exec/s: 30 rss: 70Mb L: 35/35 MS: 1 ShuffleBytes- 00:10:00.810 [2024-06-11 12:04:13.697678] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.810 [2024-06-11 12:04:13.697712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:10:00.810 [2024-06-11 12:04:13.697786] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.810 [2024-06-11 12:04:13.697806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:00.810 [2024-06-11 12:04:13.697878] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.810 [2024-06-11 12:04:13.697896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:10:00.810 #31 NEW cov: 11696 ft: 13878 corp: 19/449b lim: 35 exec/s: 31 rss: 70Mb L: 23/35 MS: 1 CrossOver- 00:10:00.810 [2024-06-11 12:04:13.757948] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.810 [2024-06-11 12:04:13.757983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:00.810 [2024-06-11 12:04:13.758056] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000001f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.810 [2024-06-11 12:04:13.758076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:10:00.810 NEW_FUNC[1/1]: 0x4d3ae0 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:10:00.810 #33 NEW cov: 11710 ft: 13913 corp: 20/473b lim: 35 exec/s: 33 rss: 70Mb L: 24/35 MS: 2 ShuffleBytes-CrossOver- 00:10:00.810 
[2024-06-11 12:04:13.808160] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.810 [2024-06-11 12:04:13.808195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:10:00.810 [2024-06-11 12:04:13.808269] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.810 [2024-06-11 12:04:13.808289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:00.810 [2024-06-11 12:04:13.808365] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.810 [2024-06-11 12:04:13.808384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:10:00.810 [2024-06-11 12:04:13.808457] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:00.810 [2024-06-11 12:04:13.808475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:10:00.810 #34 NEW cov: 11710 ft: 13928 corp: 21/506b lim: 35 exec/s: 34 rss: 70Mb L: 33/35 MS: 1 InsertRepeatedBytes- 00:10:01.069 [2024-06-11 12:04:13.858039] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.069 [2024-06-11 12:04:13.858075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:01.069 #35 NEW cov: 11710 ft: 14067 corp: 22/526b lim: 35 exec/s: 35 rss: 70Mb L: 20/35 MS: 1 EraseBytes- 00:10:01.069 [2024-06-11 12:04:13.918208] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.069 [2024-06-11 12:04:13.918249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:01.069 #36 NEW cov: 11710 ft: 14115 corp: 23/546b lim: 35 exec/s: 36 rss: 70Mb L: 20/35 MS: 1 ChangeBinInt- 00:10:01.069 [2024-06-11 12:04:13.978332] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000003c7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.069 [2024-06-11 12:04:13.978372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:10:01.069 [2024-06-11 12:04:13.978449] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000006c7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.069 [2024-06-11 12:04:13.978469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:01.069 #37 NEW cov: 11710 ft: 14136 corp: 24/561b lim: 35 exec/s: 37 rss: 70Mb L: 15/35 MS: 1 ChangeByte- 00:10:01.069 [2024-06-11 12:04:14.028964] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.069 [2024-06-11 12:04:14.028997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f 
p:0 m:0 dnr:0 00:10:01.069 [2024-06-11 12:04:14.029068] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000300 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.069 [2024-06-11 12:04:14.029087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:01.069 [2024-06-11 12:04:14.029158] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.069 [2024-06-11 12:04:14.029177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:10:01.069 [2024-06-11 12:04:14.029246] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.069 [2024-06-11 12:04:14.029265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:10:01.069 [2024-06-11 12:04:14.029334] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.069 [2024-06-11 12:04:14.029353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:10:01.069 #38 NEW cov: 11710 ft: 14139 corp: 25/596b lim: 35 exec/s: 38 rss: 70Mb L: 35/35 MS: 1 CMP- DE: "\000\000\177\211\250\000\034'"- 00:10:01.069 [2024-06-11 12:04:14.079097] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.069 [2024-06-11 12:04:14.079130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:10:01.069 [2024-06-11 12:04:14.079199] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.069 [2024-06-11 12:04:14.079219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:01.069 [2024-06-11 12:04:14.079286] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.069 [2024-06-11 12:04:14.079305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:10:01.069 [2024-06-11 12:04:14.079375] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.069 [2024-06-11 12:04:14.079394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:10:01.069 [2024-06-11 12:04:14.079473] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.069 [2024-06-11 12:04:14.079492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:10:01.329 #39 NEW cov: 11710 ft: 14212 corp: 26/631b lim: 35 exec/s: 39 rss: 70Mb L: 35/35 MS: 1 ChangeBinInt- 00:10:01.329 [2024-06-11 12:04:14.128835] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f9 SGL TRANSPORT 
DATA BLOCK TRANSPORT 0x0 00:10:01.329 [2024-06-11 12:04:14.128869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:01.329 #40 NEW cov: 11710 ft: 14215 corp: 27/651b lim: 35 exec/s: 40 rss: 70Mb L: 20/35 MS: 1 CMP- DE: "\000\000\000\001"- 00:10:01.329 [2024-06-11 12:04:14.189103] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.329 [2024-06-11 12:04:14.189138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:10:01.329 [2024-06-11 12:04:14.189211] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.329 [2024-06-11 12:04:14.189231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:01.329 [2024-06-11 12:04:14.189303] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000006c7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.329 [2024-06-11 12:04:14.189321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:10:01.329 #41 NEW cov: 11710 ft: 14222 corp: 28/675b lim: 35 exec/s: 41 rss: 70Mb L: 24/35 MS: 1 CrossOver- 00:10:01.329 [2024-06-11 12:04:14.239266] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.329 [2024-06-11 12:04:14.239299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:01.329 [2024-06-11 12:04:14.239377] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.329 [2024-06-11 12:04:14.239398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:10:01.329 #42 NEW cov: 11710 ft: 14258 corp: 29/699b lim: 35 exec/s: 42 rss: 70Mb L: 24/35 MS: 1 CopyPart- 00:10:01.329 [2024-06-11 12:04:14.289337] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.329 [2024-06-11 12:04:14.289376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:10:01.329 [2024-06-11 12:04:14.289448] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.329 [2024-06-11 12:04:14.289467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:10:01.329 [2024-06-11 12:04:14.289537] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:01.329 [2024-06-11 12:04:14.289556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:10:01.329 #43 NEW cov: 11710 ft: 14266 corp: 30/722b lim: 35 exec/s: 21 rss: 70Mb L: 23/35 MS: 1 ChangeBit- 00:10:01.329 #43 DONE cov: 11710 ft: 14266 corp: 30/722b lim: 35 exec/s: 21 rss: 70Mb 00:10:01.329 ###### Recommended dictionary. 
###### 00:10:01.329 "\001\000" # Uses: 1 00:10:01.329 "\000\000\177\211\250\000\034'" # Uses: 0 00:10:01.329 "\000\000\000\001" # Uses: 0 00:10:01.329 ###### End of recommended dictionary. ###### 00:10:01.329 Done 43 runs in 2 second(s) 00:10:01.588 12:04:14 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_15.conf 00:10:01.588 12:04:14 -- ../common.sh@72 -- # (( i++ )) 00:10:01.588 12:04:14 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:10:01.588 12:04:14 -- ../common.sh@73 -- # start_llvm_fuzz 16 1 0x1 00:10:01.588 12:04:14 -- nvmf/run.sh@23 -- # local fuzzer_type=16 00:10:01.588 12:04:14 -- nvmf/run.sh@24 -- # local timen=1 00:10:01.588 12:04:14 -- nvmf/run.sh@25 -- # local core=0x1 00:10:01.588 12:04:14 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:10:01.588 12:04:14 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_16.conf 00:10:01.588 12:04:14 -- nvmf/run.sh@29 -- # printf %02d 16 00:10:01.588 12:04:14 -- nvmf/run.sh@29 -- # port=4416 00:10:01.588 12:04:14 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:10:01.588 12:04:14 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' 00:10:01.588 12:04:14 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4416"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:10:01.588 12:04:14 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' -c /tmp/fuzz_json_16.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 -Z 16 -r /var/tmp/spdk16.sock 00:10:01.588 [2024-06-11 12:04:14.497884] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:10:01.588 [2024-06-11 12:04:14.497955] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2696672 ] 00:10:01.588 EAL: No free 2048 kB hugepages reported on node 1 00:10:01.847 [2024-06-11 12:04:14.744297] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:01.847 [2024-06-11 12:04:14.770383] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:10:01.847 [2024-06-11 12:04:14.770554] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:01.848 [2024-06-11 12:04:14.824977] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:10:01.848 [2024-06-11 12:04:14.841217] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4416 *** 00:10:01.848 INFO: Running with entropic power schedule (0xFF, 100). 
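The INFO banner above is standard libFuzzer startup output for instance 16: an entropic power schedule, a fresh random seed, 341565 inline 8-bit counters loaded from the instrumented module, and an empty starting corpus (0 files found). Each subsequent "#N NEW cov:" record reports a mutation that grew edge coverage (cov), the feature count (ft), and the corpus (corp, as entries/bytes), and names the mutation chain after "MS:"; the closing "#N DONE" record gives the final totals. As an illustrative way to chart coverage growth from a saved copy of such a log (the filename fuzz16.log and the leading HH:MM:SS.mmm timestamp prefix to strip are assumptions):

    # drop the autotest timestamp prefix, keep NEW/REDUCE/DONE records,
    # then print event id, edge coverage, feature count and corpus size
    sed -e 's/^[0-9:.]* //' fuzz16.log \
        | grep -E '^#[0-9]+ (NEW|REDUCE|DONE) cov:' \
        | awk '{print $1, $4, $6, $8}'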
00:10:01.848 INFO: Seed: 860303047 00:10:01.848 INFO: Loaded 1 modules (341565 inline 8-bit counters): 341565 [0x26b198c, 0x2704fc9), 00:10:01.848 INFO: Loaded 1 PC tables (341565 PCs): 341565 [0x2704fd0,0x2c3b3a0), 00:10:01.848 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:10:01.848 INFO: A corpus is not provided, starting from an empty corpus 00:10:01.848 #2 INITED exec/s: 0 rss: 61Mb 00:10:01.848 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:10:01.848 This may also happen if the target rejected all inputs we tried so far 00:10:02.107 [2024-06-11 12:04:14.890350] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:578721382738167816 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.107 [2024-06-11 12:04:14.890397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:02.107 [2024-06-11 12:04:14.890454] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:578721382704613384 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.107 [2024-06-11 12:04:14.890475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:02.107 [2024-06-11 12:04:14.890546] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:578721382704613384 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.107 [2024-06-11 12:04:14.890567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:02.107 [2024-06-11 12:04:14.890632] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:578721382704613384 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.107 [2024-06-11 12:04:14.890657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:02.366 NEW_FUNC[1/664]: 0x4b5130 in fuzz_nvm_read_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:519 00:10:02.366 NEW_FUNC[2/664]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:10:02.366 #39 NEW cov: 11572 ft: 11573 corp: 2/97b lim: 105 exec/s: 0 rss: 68Mb L: 96/96 MS: 2 CrossOver-InsertRepeatedBytes- 00:10:02.366 [2024-06-11 12:04:15.361396] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:578721382738167816 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.366 [2024-06-11 12:04:15.361445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:02.366 [2024-06-11 12:04:15.361488] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:578721382704613384 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.366 [2024-06-11 12:04:15.361509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:02.366 [2024-06-11 12:04:15.361574] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:578721382704613384 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.366 [2024-06-11 12:04:15.361595] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:02.366 [2024-06-11 12:04:15.361657] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:578721382704613384 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.366 [2024-06-11 12:04:15.361678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:02.625 #40 NEW cov: 11685 ft: 12111 corp: 3/199b lim: 105 exec/s: 0 rss: 68Mb L: 102/102 MS: 1 InsertRepeatedBytes- 00:10:02.625 [2024-06-11 12:04:15.421481] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:578721382738167816 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.625 [2024-06-11 12:04:15.421520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:02.625 [2024-06-11 12:04:15.421573] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:578721382704613384 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.625 [2024-06-11 12:04:15.421594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:02.625 [2024-06-11 12:04:15.421659] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:578721382704613384 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.625 [2024-06-11 12:04:15.421680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:02.625 [2024-06-11 12:04:15.421742] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:578721382704613384 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.625 [2024-06-11 12:04:15.421764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:02.625 #41 NEW cov: 11691 ft: 12377 corp: 4/295b lim: 105 exec/s: 0 rss: 68Mb L: 96/102 MS: 1 ChangeBinInt- 00:10:02.625 [2024-06-11 12:04:15.471630] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:578721382738167816 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.625 [2024-06-11 12:04:15.471668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:02.625 [2024-06-11 12:04:15.471726] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:578721382704613384 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.625 [2024-06-11 12:04:15.471751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:02.625 [2024-06-11 12:04:15.471815] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:578721382704613384 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.625 [2024-06-11 12:04:15.471836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:02.625 [2024-06-11 12:04:15.471897] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:578721382704613384 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.625 
[2024-06-11 12:04:15.471918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:02.625 #42 NEW cov: 11776 ft: 12550 corp: 5/391b lim: 105 exec/s: 0 rss: 68Mb L: 96/102 MS: 1 ChangeBinInt- 00:10:02.625 [2024-06-11 12:04:15.521511] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:578721382738167816 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.625 [2024-06-11 12:04:15.521548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:02.625 [2024-06-11 12:04:15.521594] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:578721382704613384 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.625 [2024-06-11 12:04:15.521616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:02.626 #43 NEW cov: 11776 ft: 13139 corp: 6/453b lim: 105 exec/s: 0 rss: 69Mb L: 62/102 MS: 1 EraseBytes- 00:10:02.626 [2024-06-11 12:04:15.581483] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:578721391328102408 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.626 [2024-06-11 12:04:15.581519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:02.626 #44 NEW cov: 11776 ft: 13696 corp: 7/494b lim: 105 exec/s: 0 rss: 69Mb L: 41/102 MS: 1 CrossOver- 00:10:02.626 [2024-06-11 12:04:15.642133] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:578721382738167816 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.626 [2024-06-11 12:04:15.642171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:02.626 [2024-06-11 12:04:15.642224] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:578721382704613384 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.626 [2024-06-11 12:04:15.642245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:02.626 [2024-06-11 12:04:15.642306] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:578721382704613384 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.626 [2024-06-11 12:04:15.642326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:02.626 [2024-06-11 12:04:15.642393] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:578721382704613384 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.626 [2024-06-11 12:04:15.642414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:02.885 #45 NEW cov: 11776 ft: 13773 corp: 8/590b lim: 105 exec/s: 0 rss: 69Mb L: 96/102 MS: 1 ChangeByte- 00:10:02.885 [2024-06-11 12:04:15.701956] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:578721382738167816 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.885 [2024-06-11 12:04:15.701993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:02.885 [2024-06-11 12:04:15.702045] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:578721382704613384 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.885 [2024-06-11 12:04:15.702067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:02.885 #46 NEW cov: 11776 ft: 13816 corp: 9/652b lim: 105 exec/s: 0 rss: 69Mb L: 62/102 MS: 1 CopyPart- 00:10:02.885 [2024-06-11 12:04:15.752382] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:578721382738167816 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.885 [2024-06-11 12:04:15.752419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:02.885 [2024-06-11 12:04:15.752477] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:578721382704613384 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.885 [2024-06-11 12:04:15.752499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:02.885 [2024-06-11 12:04:15.752562] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:578721382704613384 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.885 [2024-06-11 12:04:15.752582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:02.885 [2024-06-11 12:04:15.752646] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:578721382704613384 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.885 [2024-06-11 12:04:15.752666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:02.885 NEW_FUNC[1/1]: 0x19807e0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:10:02.885 #47 NEW cov: 11799 ft: 13839 corp: 10/748b lim: 105 exec/s: 0 rss: 69Mb L: 96/102 MS: 1 ChangeByte- 00:10:02.885 [2024-06-11 12:04:15.802255] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:578721382738167816 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.885 [2024-06-11 12:04:15.802293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:02.885 [2024-06-11 12:04:15.802339] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:578721382704615432 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.885 [2024-06-11 12:04:15.802365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:02.885 #48 NEW cov: 11799 ft: 13933 corp: 11/810b lim: 105 exec/s: 0 rss: 69Mb L: 62/102 MS: 1 ChangeBinInt- 00:10:02.885 [2024-06-11 12:04:15.852671] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:578721382738167816 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.885 [2024-06-11 12:04:15.852707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:02.885 [2024-06-11 12:04:15.852763] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:578721382704613384 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.885 [2024-06-11 12:04:15.852784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:02.885 [2024-06-11 12:04:15.852848] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:578721382704613384 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.885 [2024-06-11 12:04:15.852868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:02.885 [2024-06-11 12:04:15.852931] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:578721382704613384 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.885 [2024-06-11 12:04:15.852960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:02.885 #49 NEW cov: 11799 ft: 13950 corp: 12/907b lim: 105 exec/s: 49 rss: 69Mb L: 97/102 MS: 1 InsertByte- 00:10:02.885 [2024-06-11 12:04:15.902544] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:578721382738167816 len:2185 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.885 [2024-06-11 12:04:15.902581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:02.885 [2024-06-11 12:04:15.902641] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:578721382704613384 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:02.885 [2024-06-11 12:04:15.902663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:03.144 #50 NEW cov: 11799 ft: 13961 corp: 13/969b lim: 105 exec/s: 50 rss: 69Mb L: 62/102 MS: 1 ChangeBit- 00:10:03.144 [2024-06-11 12:04:15.952668] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:578721382738167816 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.144 [2024-06-11 12:04:15.952705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:03.144 [2024-06-11 12:04:15.952751] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:578721382704615432 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.144 [2024-06-11 12:04:15.952773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:03.144 #51 NEW cov: 11799 ft: 13991 corp: 14/1031b lim: 105 exec/s: 51 rss: 69Mb L: 62/102 MS: 1 CMP- DE: "\001\000\177\240\250\034\033."- 00:10:03.144 [2024-06-11 12:04:16.013083] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:578721382738167816 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.144 [2024-06-11 12:04:16.013120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:03.144 [2024-06-11 12:04:16.013179] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:578721382704613384 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.144 [2024-06-11 12:04:16.013200] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:03.144 [2024-06-11 12:04:16.013262] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:578721382704613384 len:2049 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.144 [2024-06-11 12:04:16.013283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:03.144 [2024-06-11 12:04:16.013347] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:578721382704613384 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.144 [2024-06-11 12:04:16.013373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:03.144 #52 NEW cov: 11799 ft: 14013 corp: 15/1134b lim: 105 exec/s: 52 rss: 69Mb L: 103/103 MS: 1 InsertByte- 00:10:03.144 [2024-06-11 12:04:16.072963] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:578721382738167816 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.144 [2024-06-11 12:04:16.073000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:03.144 [2024-06-11 12:04:16.073045] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:578721606042914824 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.144 [2024-06-11 12:04:16.073066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:03.144 #53 NEW cov: 11799 ft: 14042 corp: 16/1196b lim: 105 exec/s: 53 rss: 69Mb L: 62/103 MS: 1 ChangeByte- 00:10:03.144 [2024-06-11 12:04:16.123412] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:578721382738167816 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.144 [2024-06-11 12:04:16.123449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:03.144 [2024-06-11 12:04:16.123504] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:578721382704613384 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.144 [2024-06-11 12:04:16.123525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:03.144 [2024-06-11 12:04:16.123587] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:578721382704613384 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.144 [2024-06-11 12:04:16.123608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:03.144 [2024-06-11 12:04:16.123673] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:578721382704613384 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.144 [2024-06-11 12:04:16.123693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:03.144 #54 NEW cov: 11799 ft: 14098 corp: 17/1292b lim: 105 exec/s: 54 rss: 69Mb L: 96/103 MS: 1 CopyPart- 00:10:03.144 [2024-06-11 12:04:16.163133] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 
cid:0 nsid:0 lba:578721391328102408 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.145 [2024-06-11 12:04:16.163167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:03.403 #55 NEW cov: 11799 ft: 14136 corp: 18/1333b lim: 105 exec/s: 55 rss: 69Mb L: 41/103 MS: 1 ChangeBinInt- 00:10:03.403 [2024-06-11 12:04:16.223449] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:578721382738167816 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.403 [2024-06-11 12:04:16.223486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:03.403 [2024-06-11 12:04:16.223532] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:578721382704613384 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.403 [2024-06-11 12:04:16.223554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:03.403 #56 NEW cov: 11799 ft: 14170 corp: 19/1395b lim: 105 exec/s: 56 rss: 69Mb L: 62/103 MS: 1 CMP- DE: "\001\017\225'N\022\032d"- 00:10:03.403 [2024-06-11 12:04:16.273448] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:578721382738167816 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.403 [2024-06-11 12:04:16.273485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:03.403 #57 NEW cov: 11799 ft: 14190 corp: 20/1436b lim: 105 exec/s: 57 rss: 70Mb L: 41/103 MS: 1 EraseBytes- 00:10:03.403 [2024-06-11 12:04:16.333761] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:578721382738167816 len:2185 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.403 [2024-06-11 12:04:16.333798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:03.403 [2024-06-11 12:04:16.333843] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:578721382704613384 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.403 [2024-06-11 12:04:16.333865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:03.403 #58 NEW cov: 11799 ft: 14225 corp: 21/1487b lim: 105 exec/s: 58 rss: 70Mb L: 51/103 MS: 1 EraseBytes- 00:10:03.403 [2024-06-11 12:04:16.393916] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:578721382738167816 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.403 [2024-06-11 12:04:16.393953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:03.403 [2024-06-11 12:04:16.394001] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:578721382704615432 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.403 [2024-06-11 12:04:16.394022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:03.403 #59 NEW cov: 11799 ft: 14232 corp: 22/1549b lim: 105 exec/s: 59 rss: 70Mb L: 62/103 MS: 1 PersAutoDict- DE: "\001\000\177\240\250\034\033."- 00:10:03.662 [2024-06-11 12:04:16.444190] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:578721382738167816 len:2185 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.662 [2024-06-11 12:04:16.444227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:03.662 [2024-06-11 12:04:16.444277] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1012762419733073422 len:3599 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.662 [2024-06-11 12:04:16.444297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:03.662 [2024-06-11 12:04:16.444367] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:578721382704613384 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.662 [2024-06-11 12:04:16.444389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:03.662 #60 NEW cov: 11799 ft: 14522 corp: 23/1629b lim: 105 exec/s: 60 rss: 70Mb L: 80/103 MS: 1 InsertRepeatedBytes- 00:10:03.662 [2024-06-11 12:04:16.494502] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:578721382738167816 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.662 [2024-06-11 12:04:16.494538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:03.662 [2024-06-11 12:04:16.494597] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:578721382704613384 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.662 [2024-06-11 12:04:16.494618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:03.662 [2024-06-11 12:04:16.494682] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:578721382704613384 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.662 [2024-06-11 12:04:16.494703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:03.662 [2024-06-11 12:04:16.494768] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:7306357055647409509 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.662 [2024-06-11 12:04:16.494790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:03.662 #61 NEW cov: 11799 ft: 14529 corp: 24/1730b lim: 105 exec/s: 61 rss: 70Mb L: 101/103 MS: 1 InsertRepeatedBytes- 00:10:03.662 [2024-06-11 12:04:16.534216] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:578721391328102408 len:3990 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.662 [2024-06-11 12:04:16.534252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:03.662 #62 NEW cov: 11799 ft: 14629 corp: 25/1771b lim: 105 exec/s: 62 rss: 70Mb L: 41/103 MS: 1 CMP- DE: "\000\017\225'\234+\021\326"- 00:10:03.662 [2024-06-11 12:04:16.594422] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:578721382738167816 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.662 [2024-06-11 
12:04:16.594462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:03.662 #63 NEW cov: 11799 ft: 14651 corp: 26/1812b lim: 105 exec/s: 63 rss: 70Mb L: 41/103 MS: 1 ShuffleBytes- 00:10:03.662 [2024-06-11 12:04:16.654996] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:578721382738167816 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.662 [2024-06-11 12:04:16.655035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:03.662 [2024-06-11 12:04:16.655084] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:578721382704617480 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.662 [2024-06-11 12:04:16.655106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:03.662 [2024-06-11 12:04:16.655168] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:578721382704613384 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.662 [2024-06-11 12:04:16.655189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:03.662 [2024-06-11 12:04:16.655252] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:7306357055647409509 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.662 [2024-06-11 12:04:16.655273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:03.922 #64 NEW cov: 11799 ft: 14656 corp: 27/1913b lim: 105 exec/s: 64 rss: 70Mb L: 101/103 MS: 1 ChangeBit- 00:10:03.922 [2024-06-11 12:04:16.715144] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:578721382738167816 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.922 [2024-06-11 12:04:16.715181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:03.922 [2024-06-11 12:04:16.715238] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:578721382704613384 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.922 [2024-06-11 12:04:16.715260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:03.922 [2024-06-11 12:04:16.715322] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:578721382704613384 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.922 [2024-06-11 12:04:16.715343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:03.922 [2024-06-11 12:04:16.715413] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:578721382704613384 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.922 [2024-06-11 12:04:16.715435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:03.922 #65 NEW cov: 11799 ft: 14679 corp: 28/2016b lim: 105 exec/s: 65 rss: 70Mb L: 103/103 MS: 1 CopyPart- 00:10:03.922 [2024-06-11 12:04:16.765029] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:578721382738167816 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.922 [2024-06-11 12:04:16.765065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:03.922 [2024-06-11 12:04:16.765112] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:578721382704615432 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.922 [2024-06-11 12:04:16.765134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:03.922 #66 NEW cov: 11799 ft: 14716 corp: 29/2078b lim: 105 exec/s: 66 rss: 70Mb L: 62/103 MS: 1 ShuffleBytes- 00:10:03.922 [2024-06-11 12:04:16.825524] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:578721382738167816 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.922 [2024-06-11 12:04:16.825560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:03.922 [2024-06-11 12:04:16.825617] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:578721382704613384 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.922 [2024-06-11 12:04:16.825638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:03.922 [2024-06-11 12:04:16.825703] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:578721384701102088 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.922 [2024-06-11 12:04:16.825724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:03.922 [2024-06-11 12:04:16.825788] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:578721382704613384 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.922 [2024-06-11 12:04:16.825809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:03.922 #67 NEW cov: 11799 ft: 14727 corp: 30/2174b lim: 105 exec/s: 67 rss: 70Mb L: 96/103 MS: 1 ChangeByte- 00:10:03.922 [2024-06-11 12:04:16.865607] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:578721382738167816 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.922 [2024-06-11 12:04:16.865642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:03.922 [2024-06-11 12:04:16.865701] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:578721382704613384 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.922 [2024-06-11 12:04:16.865722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:03.922 [2024-06-11 12:04:16.865786] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:578721382704613384 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:10:03.922 [2024-06-11 12:04:16.865807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:03.922 [2024-06-11 
12:04:16.865872] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:578721382704613384 len:2057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:10:03.922 [2024-06-11 12:04:16.865891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:10:03.922 #68 NEW cov: 11799 ft: 14781 corp: 31/2270b lim: 105 exec/s: 34 rss: 70Mb L: 96/103 MS: 1 ChangeBit-
00:10:03.922 #68 DONE cov: 11799 ft: 14781 corp: 31/2270b lim: 105 exec/s: 34 rss: 70Mb
00:10:03.922 ###### Recommended dictionary. ######
00:10:03.922 "\001\000\177\240\250\034\033." # Uses: 1
00:10:03.922 "\001\017\225'N\022\032d" # Uses: 0
00:10:03.922 "\000\017\225'\234+\021\326" # Uses: 0
00:10:03.922 ###### End of recommended dictionary. ######
00:10:03.922 Done 68 runs in 2 second(s)
00:10:04.182 12:04:17 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_16.conf
00:10:04.182 12:04:17 -- ../common.sh@72 -- # (( i++ ))
00:10:04.182 12:04:17 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:10:04.182 12:04:17 -- ../common.sh@73 -- # start_llvm_fuzz 17 1 0x1
00:10:04.182 12:04:17 -- nvmf/run.sh@23 -- # local fuzzer_type=17
00:10:04.182 12:04:17 -- nvmf/run.sh@24 -- # local timen=1
00:10:04.182 12:04:17 -- nvmf/run.sh@25 -- # local core=0x1
00:10:04.182 12:04:17 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17
00:10:04.182 12:04:17 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_17.conf
00:10:04.182 12:04:17 -- nvmf/run.sh@29 -- # printf %02d 17
00:10:04.182 12:04:17 -- nvmf/run.sh@29 -- # port=4417
00:10:04.182 12:04:17 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17
00:10:04.182 12:04:17 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417'
00:10:04.182 12:04:17 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4417"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:10:04.182 12:04:17 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' -c /tmp/fuzz_json_17.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 -Z 17 -r /var/tmp/spdk17.sock
00:10:04.182 [2024-06-11 12:04:17.088119] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization...
00:10:04.182 [2024-06-11 12:04:17.088211] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2697036 ]
00:10:04.441 EAL: No free 2048 kB hugepages reported on node 1
00:10:04.441 [2024-06-11 12:04:17.346945] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:10:04.441 [2024-06-11 12:04:17.373104] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:10:04.441 [2024-06-11 12:04:17.373281] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:10:04.441 [2024-06-11 12:04:17.427720] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:10:04.441 [2024-06-11 12:04:17.443956] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4417 ***
00:10:04.441 INFO: Running with entropic power schedule (0xFF, 100).
00:10:04.441 INFO: Seed: 3463304669
00:10:04.700 INFO: Loaded 1 modules (341565 inline 8-bit counters): 341565 [0x26b198c, 0x2704fc9),
00:10:04.700 INFO: Loaded 1 PC tables (341565 PCs): 341565 [0x2704fd0,0x2c3b3a0),
00:10:04.700 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17
00:10:04.700 INFO: A corpus is not provided, starting from an empty corpus
00:10:04.700 #2 INITED exec/s: 0 rss: 61Mb
00:10:04.700 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:10:04.700 This may also happen if the target rejected all inputs we tried so far
00:10:04.700 [2024-06-11 12:04:17.493262] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:434041040467789318 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:10:04.700 [2024-06-11 12:04:17.493303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:10:04.700 [2024-06-11 12:04:17.493349] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:434041037028460038 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:10:04.700 [2024-06-11 12:04:17.493377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:10:04.700 [2024-06-11 12:04:17.493441] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:434041037028460038 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:10:04.700 [2024-06-11 12:04:17.493463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:10:04.959 NEW_FUNC[1/665]: 0x4b8420 in fuzz_nvm_write_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:540
00:10:04.959 NEW_FUNC[2/665]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:10:04.959 #7 NEW cov: 11593 ft: 11594 corp: 2/89b lim: 120 exec/s: 0 rss: 68Mb L: 88/88 MS: 5 CopyPart-ChangeByte-InsertByte-InsertByte-InsertRepeatedBytes-
00:10:04.959 [2024-06-11 12:04:17.964479] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:434041289575892486 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:10:04.959 [2024-06-11 12:04:17.964528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR
FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:04.959 [2024-06-11 12:04:17.964600] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:434041037028460038 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:04.959 [2024-06-11 12:04:17.964621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:04.959 [2024-06-11 12:04:17.964688] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:434041037028460038 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:04.959 [2024-06-11 12:04:17.964710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:05.218 #18 NEW cov: 11706 ft: 12028 corp: 3/178b lim: 120 exec/s: 0 rss: 68Mb L: 89/89 MS: 1 InsertByte- 00:10:05.218 [2024-06-11 12:04:18.024557] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:434041040467789318 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.218 [2024-06-11 12:04:18.024599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:05.218 [2024-06-11 12:04:18.024643] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:434041037028460038 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.218 [2024-06-11 12:04:18.024664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:05.218 [2024-06-11 12:04:18.024730] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:434041037028460204 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.218 [2024-06-11 12:04:18.024752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:05.218 #19 NEW cov: 11712 ft: 12425 corp: 4/266b lim: 120 exec/s: 0 rss: 68Mb L: 88/89 MS: 1 ChangeByte- 00:10:05.218 [2024-06-11 12:04:18.074643] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:434041289575892486 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.218 [2024-06-11 12:04:18.074682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:05.218 [2024-06-11 12:04:18.074732] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:434041037028460038 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.218 [2024-06-11 12:04:18.074755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:05.218 [2024-06-11 12:04:18.074820] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:434041037028460038 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.218 [2024-06-11 12:04:18.074842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:05.218 #20 NEW cov: 11797 ft: 12637 corp: 5/348b lim: 120 exec/s: 0 rss: 68Mb L: 82/89 MS: 1 EraseBytes- 00:10:05.218 [2024-06-11 12:04:18.134842] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:434041289575892486 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:10:05.218 [2024-06-11 12:04:18.134880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:05.218 [2024-06-11 12:04:18.134930] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:434041037028460038 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.218 [2024-06-11 12:04:18.134951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:05.218 [2024-06-11 12:04:18.135015] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:434041037028460038 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.218 [2024-06-11 12:04:18.135037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:05.218 #21 NEW cov: 11797 ft: 12732 corp: 6/430b lim: 120 exec/s: 0 rss: 69Mb L: 82/89 MS: 1 ChangeBit- 00:10:05.218 [2024-06-11 12:04:18.194995] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:434041040467789318 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.218 [2024-06-11 12:04:18.195032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:05.218 [2024-06-11 12:04:18.195086] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:434041037028460038 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.218 [2024-06-11 12:04:18.195107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:05.218 [2024-06-11 12:04:18.195174] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:434041037028460038 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.218 [2024-06-11 12:04:18.195195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:05.218 #22 NEW cov: 11797 ft: 12817 corp: 7/519b lim: 120 exec/s: 0 rss: 69Mb L: 89/89 MS: 1 InsertByte- 00:10:05.218 [2024-06-11 12:04:18.245398] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:434041040467789318 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.218 [2024-06-11 12:04:18.245435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:05.218 [2024-06-11 12:04:18.245497] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:1953184666274372379 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.218 [2024-06-11 12:04:18.245520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:05.218 [2024-06-11 12:04:18.245583] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:434041037028460038 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.218 [2024-06-11 12:04:18.245605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:05.218 [2024-06-11 12:04:18.245670] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:434041037028460038 len:1543 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:10:05.218 [2024-06-11 12:04:18.245691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:05.477 #23 NEW cov: 11797 ft: 13249 corp: 8/615b lim: 120 exec/s: 0 rss: 69Mb L: 96/96 MS: 1 InsertRepeatedBytes- 00:10:05.477 [2024-06-11 12:04:18.305294] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:434041040467789318 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.477 [2024-06-11 12:04:18.305331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:05.477 [2024-06-11 12:04:18.305390] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:434041037028460038 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.477 [2024-06-11 12:04:18.305412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:05.477 [2024-06-11 12:04:18.305477] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:434041037028460038 len:1790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.477 [2024-06-11 12:04:18.305499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:05.477 #24 NEW cov: 11797 ft: 13344 corp: 9/703b lim: 120 exec/s: 0 rss: 69Mb L: 88/96 MS: 1 ChangeBinInt- 00:10:05.477 [2024-06-11 12:04:18.355524] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:434041289575892486 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.477 [2024-06-11 12:04:18.355565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:05.477 [2024-06-11 12:04:18.355610] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:434041037028460038 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.477 [2024-06-11 12:04:18.355631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:05.477 [2024-06-11 12:04:18.355695] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:434041037028460038 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.477 [2024-06-11 12:04:18.355714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:05.477 NEW_FUNC[1/1]: 0x19807e0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:10:05.477 #25 NEW cov: 11820 ft: 13390 corp: 10/785b lim: 120 exec/s: 0 rss: 69Mb L: 82/96 MS: 1 ChangeByte- 00:10:05.477 [2024-06-11 12:04:18.415692] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:436293089389577734 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.477 [2024-06-11 12:04:18.415730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:05.477 [2024-06-11 12:04:18.415783] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:434041037028460038 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.477 [2024-06-11 12:04:18.415804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:05.477 [2024-06-11 12:04:18.415868] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:434041037028460038 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.477 [2024-06-11 12:04:18.415888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:05.477 #26 NEW cov: 11820 ft: 13429 corp: 11/874b lim: 120 exec/s: 0 rss: 69Mb L: 89/96 MS: 1 ChangeBit- 00:10:05.477 [2024-06-11 12:04:18.465636] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:434041289575892486 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.477 [2024-06-11 12:04:18.465675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:05.477 [2024-06-11 12:04:18.465724] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:434041037028460038 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.477 [2024-06-11 12:04:18.465746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:05.477 #27 NEW cov: 11820 ft: 13797 corp: 12/934b lim: 120 exec/s: 27 rss: 69Mb L: 60/96 MS: 1 EraseBytes- 00:10:05.736 [2024-06-11 12:04:18.515896] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:434041040467789318 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.736 [2024-06-11 12:04:18.515934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:05.736 [2024-06-11 12:04:18.515983] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:434041037028460038 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.736 [2024-06-11 12:04:18.516003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:05.736 [2024-06-11 12:04:18.516069] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:434041037028460038 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.736 [2024-06-11 12:04:18.516091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:05.736 #28 NEW cov: 11820 ft: 13832 corp: 13/1022b lim: 120 exec/s: 28 rss: 69Mb L: 88/96 MS: 1 ShuffleBytes- 00:10:05.736 [2024-06-11 12:04:18.556278] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:434041040467789318 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.736 [2024-06-11 12:04:18.556316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:05.736 [2024-06-11 12:04:18.556381] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:1953184666274372379 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.736 [2024-06-11 12:04:18.556404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:05.736 [2024-06-11 12:04:18.556468] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:434041037028460038 len:1543 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:10:05.736 [2024-06-11 12:04:18.556490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:05.736 [2024-06-11 12:04:18.556553] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:434041037028460038 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.736 [2024-06-11 12:04:18.556573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:05.736 #29 NEW cov: 11820 ft: 13870 corp: 14/1118b lim: 120 exec/s: 29 rss: 69Mb L: 96/96 MS: 1 ChangeBit- 00:10:05.736 [2024-06-11 12:04:18.616258] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:434041289575892486 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.736 [2024-06-11 12:04:18.616295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:05.736 [2024-06-11 12:04:18.616346] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:434041037028460038 len:1570 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.736 [2024-06-11 12:04:18.616374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:05.736 [2024-06-11 12:04:18.616439] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:434041037028460038 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.736 [2024-06-11 12:04:18.616461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:05.736 #30 NEW cov: 11820 ft: 13889 corp: 15/1208b lim: 120 exec/s: 30 rss: 69Mb L: 90/96 MS: 1 InsertByte- 00:10:05.736 [2024-06-11 12:04:18.656440] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:434041289575892486 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.736 [2024-06-11 12:04:18.656479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:05.736 [2024-06-11 12:04:18.656528] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:434041037028460038 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.736 [2024-06-11 12:04:18.656549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:05.736 [2024-06-11 12:04:18.656614] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:434041037028460038 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.736 [2024-06-11 12:04:18.656633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:05.736 #31 NEW cov: 11820 ft: 13957 corp: 16/1290b lim: 120 exec/s: 31 rss: 69Mb L: 82/96 MS: 1 ChangeASCIIInt- 00:10:05.736 [2024-06-11 12:04:18.716594] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:434041040467789318 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.736 [2024-06-11 12:04:18.716632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:05.736 [2024-06-11 
12:04:18.716683] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:434041037028460038 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.736 [2024-06-11 12:04:18.716705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:05.736 [2024-06-11 12:04:18.716766] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:434041037028460038 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.736 [2024-06-11 12:04:18.716788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:05.736 #32 NEW cov: 11820 ft: 13971 corp: 17/1378b lim: 120 exec/s: 32 rss: 69Mb L: 88/96 MS: 1 CrossOver- 00:10:05.995 [2024-06-11 12:04:18.776572] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:434041040467789318 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.995 [2024-06-11 12:04:18.776609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:05.995 [2024-06-11 12:04:18.776653] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:434041037028460038 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.995 [2024-06-11 12:04:18.776675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:05.995 #33 NEW cov: 11820 ft: 14024 corp: 18/1445b lim: 120 exec/s: 33 rss: 70Mb L: 67/96 MS: 1 EraseBytes- 00:10:05.995 [2024-06-11 12:04:18.836607] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15205847816531279871 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.995 [2024-06-11 12:04:18.836644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:05.996 #38 NEW cov: 11820 ft: 14839 corp: 19/1480b lim: 120 exec/s: 38 rss: 70Mb L: 35/96 MS: 5 InsertByte-ChangeBit-ChangeBinInt-CMP-CrossOver- DE: "\376\377\377\377"- 00:10:05.996 [2024-06-11 12:04:18.897095] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:434041040467789318 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.996 [2024-06-11 12:04:18.897133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:05.996 [2024-06-11 12:04:18.897184] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:434041037028460038 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.996 [2024-06-11 12:04:18.897205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:05.996 [2024-06-11 12:04:18.897272] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:434041037028460038 len:38663 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.996 [2024-06-11 12:04:18.897294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:05.996 #39 NEW cov: 11820 ft: 14856 corp: 20/1569b lim: 120 exec/s: 39 rss: 70Mb L: 89/96 MS: 1 InsertByte- 00:10:05.996 [2024-06-11 12:04:18.947283] nvme_qpair.c: 247:nvme_io_qpair_print_command: 
*NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:434041040467789318 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.996 [2024-06-11 12:04:18.947319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:05.996 [2024-06-11 12:04:18.947366] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:434041037028460038 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.996 [2024-06-11 12:04:18.947388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:05.996 [2024-06-11 12:04:18.947453] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18012701988911187699 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.996 [2024-06-11 12:04:18.947480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:05.996 #40 NEW cov: 11820 ft: 14866 corp: 21/1657b lim: 120 exec/s: 40 rss: 70Mb L: 88/96 MS: 1 ChangeBinInt- 00:10:05.996 [2024-06-11 12:04:18.987007] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:491743407975564806 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:05.996 [2024-06-11 12:04:18.987043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:05.996 #44 NEW cov: 11820 ft: 14878 corp: 22/1690b lim: 120 exec/s: 44 rss: 70Mb L: 33/96 MS: 4 CrossOver-ChangeByte-ChangeBit-CrossOver- 00:10:06.254 [2024-06-11 12:04:19.037714] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:3540385792 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.255 [2024-06-11 12:04:19.037751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:06.255 [2024-06-11 12:04:19.037809] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:434041037028460038 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.255 [2024-06-11 12:04:19.037831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:06.255 [2024-06-11 12:04:19.037894] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:434041037028460038 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.255 [2024-06-11 12:04:19.037917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:06.255 [2024-06-11 12:04:19.037981] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:434041037028460038 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.255 [2024-06-11 12:04:19.038002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:06.255 #50 NEW cov: 11820 ft: 14918 corp: 23/1798b lim: 120 exec/s: 50 rss: 70Mb L: 108/108 MS: 1 InsertRepeatedBytes- 00:10:06.255 [2024-06-11 12:04:19.087304] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15205847816531279871 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.255 [2024-06-11 12:04:19.087342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:06.255 #51 NEW cov: 11820 ft: 14964 corp: 24/1833b lim: 120 exec/s: 51 rss: 70Mb L: 35/108 MS: 1 ChangeByte- 00:10:06.255 [2024-06-11 12:04:19.147831] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:434041289575892486 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.255 [2024-06-11 12:04:19.147870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:06.255 [2024-06-11 12:04:19.147913] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:434041037028460038 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.255 [2024-06-11 12:04:19.147936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:06.255 [2024-06-11 12:04:19.147999] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:434041037028460038 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.255 [2024-06-11 12:04:19.148020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:06.255 #52 NEW cov: 11820 ft: 14972 corp: 25/1915b lim: 120 exec/s: 52 rss: 70Mb L: 82/108 MS: 1 PersAutoDict- DE: "\376\377\377\377"- 00:10:06.255 [2024-06-11 12:04:19.198134] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:434041040467789318 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.255 [2024-06-11 12:04:19.198175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:06.255 [2024-06-11 12:04:19.198225] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:434041037028460038 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.255 [2024-06-11 12:04:19.198247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:06.255 [2024-06-11 12:04:19.198311] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:434041037028460038 len:38663 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.255 [2024-06-11 12:04:19.198331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:06.255 [2024-06-11 12:04:19.198411] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:434041037028460038 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.255 [2024-06-11 12:04:19.198432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:06.255 #53 NEW cov: 11820 ft: 14987 corp: 26/2019b lim: 120 exec/s: 53 rss: 70Mb L: 104/108 MS: 1 CrossOver- 00:10:06.255 [2024-06-11 12:04:19.258137] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:436293089389577734 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.255 [2024-06-11 12:04:19.258175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:06.255 [2024-06-11 12:04:19.258220] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 
lba:434041037028460038 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.255 [2024-06-11 12:04:19.258242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:06.255 [2024-06-11 12:04:19.258308] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:434041037028460038 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.255 [2024-06-11 12:04:19.258329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:06.514 #54 NEW cov: 11820 ft: 15007 corp: 27/2108b lim: 120 exec/s: 54 rss: 70Mb L: 89/108 MS: 1 ShuffleBytes- 00:10:06.514 [2024-06-11 12:04:19.318312] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:436293089389577734 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.514 [2024-06-11 12:04:19.318350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:06.514 [2024-06-11 12:04:19.318404] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:434041037028460038 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.514 [2024-06-11 12:04:19.318424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:06.514 [2024-06-11 12:04:19.318487] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:434041037028460038 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.514 [2024-06-11 12:04:19.318509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:06.514 #60 NEW cov: 11820 ft: 15021 corp: 28/2197b lim: 120 exec/s: 60 rss: 70Mb L: 89/108 MS: 1 ChangeASCIIInt- 00:10:06.514 [2024-06-11 12:04:19.378520] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:434041040467789318 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.514 [2024-06-11 12:04:19.378559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:06.514 [2024-06-11 12:04:19.378607] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:434041037028460038 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.514 [2024-06-11 12:04:19.378632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:06.514 [2024-06-11 12:04:19.378697] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:434041037028460038 len:1790 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.514 [2024-06-11 12:04:19.378719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:06.514 #61 NEW cov: 11820 ft: 15054 corp: 29/2285b lim: 120 exec/s: 61 rss: 70Mb L: 88/108 MS: 1 ShuffleBytes- 00:10:06.514 [2024-06-11 12:04:19.428662] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:434041037109003782 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.514 [2024-06-11 12:04:19.428700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 
p:0 m:0 dnr:1 00:10:06.514 [2024-06-11 12:04:19.428754] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:434041037028460038 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.514 [2024-06-11 12:04:19.428775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:06.514 [2024-06-11 12:04:19.428838] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:434041037028460038 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.514 [2024-06-11 12:04:19.428860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:06.514 #62 NEW cov: 11820 ft: 15069 corp: 30/2376b lim: 120 exec/s: 62 rss: 70Mb L: 91/108 MS: 1 CrossOver- 00:10:06.514 [2024-06-11 12:04:19.468732] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:434041040467789318 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.514 [2024-06-11 12:04:19.468769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:06.514 [2024-06-11 12:04:19.468820] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:434041037028460038 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.514 [2024-06-11 12:04:19.468841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:06.514 [2024-06-11 12:04:19.468906] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:434041037028460038 len:1543 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:06.514 [2024-06-11 12:04:19.468927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:06.514 #63 NEW cov: 11820 ft: 15075 corp: 31/2465b lim: 120 exec/s: 31 rss: 70Mb L: 89/108 MS: 1 InsertByte- 00:10:06.514 #63 DONE cov: 11820 ft: 15075 corp: 31/2465b lim: 120 exec/s: 31 rss: 70Mb 00:10:06.514 ###### Recommended dictionary. ###### 00:10:06.514 "\376\377\377\377" # Uses: 2 00:10:06.514 ###### End of recommended dictionary. 
###### 00:10:06.514 Done 63 runs in 2 second(s) 00:10:06.773 12:04:19 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_17.conf 00:10:06.773 12:04:19 -- ../common.sh@72 -- # (( i++ )) 00:10:06.773 12:04:19 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:10:06.773 12:04:19 -- ../common.sh@73 -- # start_llvm_fuzz 18 1 0x1 00:10:06.773 12:04:19 -- nvmf/run.sh@23 -- # local fuzzer_type=18 00:10:06.773 12:04:19 -- nvmf/run.sh@24 -- # local timen=1 00:10:06.773 12:04:19 -- nvmf/run.sh@25 -- # local core=0x1 00:10:06.773 12:04:19 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:10:06.773 12:04:19 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_18.conf 00:10:06.773 12:04:19 -- nvmf/run.sh@29 -- # printf %02d 18 00:10:06.773 12:04:19 -- nvmf/run.sh@29 -- # port=4418 00:10:06.773 12:04:19 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:10:06.773 12:04:19 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' 00:10:06.773 12:04:19 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4418"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:10:06.774 12:04:19 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' -c /tmp/fuzz_json_18.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 -Z 18 -r /var/tmp/spdk18.sock 00:10:06.774 [2024-06-11 12:04:19.690503] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:10:06.774 [2024-06-11 12:04:19.690595] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2697400 ] 00:10:06.774 EAL: No free 2048 kB hugepages reported on node 1 00:10:07.032 [2024-06-11 12:04:19.946652] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:07.032 [2024-06-11 12:04:19.972795] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:10:07.032 [2024-06-11 12:04:19.972967] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:07.032 [2024-06-11 12:04:20.027716] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:10:07.032 [2024-06-11 12:04:20.043967] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4418 *** 00:10:07.032 INFO: Running with entropic power schedule (0xFF, 100). 00:10:07.032 INFO: Seed: 1769350059 00:10:07.291 INFO: Loaded 1 modules (341565 inline 8-bit counters): 341565 [0x26b198c, 0x2704fc9), 00:10:07.291 INFO: Loaded 1 PC tables (341565 PCs): 341565 [0x2704fd0,0x2c3b3a0), 00:10:07.291 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:10:07.291 INFO: A corpus is not provided, starting from an empty corpus 00:10:07.291 #2 INITED exec/s: 0 rss: 61Mb 00:10:07.291 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
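The nvmf/run.sh trace above (markers @23 through @36) spells out how each fuzz run is provisioned: the run number doubles as the TCP service-id suffix (printf %02d 18 yields port 4418), the stock JSON transport config is rewritten with sed, and llvm_nvme_fuzz is launched against a per-run corpus directory and RPC socket. A minimal sketch of that sequence as a standalone function follows; the start_llvm_fuzz name and argument order come straight from the trace, while the WORKSPACE variable and the redirect of sed's output into the per-run config file are assumptions made so the example is self-contained.

#!/usr/bin/env bash
set -euo pipefail

# Sketch of the traced start_llvm_fuzz sequence (nvmf/run.sh@23-@36).
# WORKSPACE is an assumed stand-in for the Jenkins workspace root.
WORKSPACE=${WORKSPACE:-/var/jenkins/workspace/short-fuzz-phy-autotest}

start_llvm_fuzz() {
    local fuzzer_type=$1    # 18, 19, 20, ... in this log
    local timen=$2          # time budget handed to the fuzzer's -t flag
    local core=$3           # core mask handed to -m, 0x1 here

    local corpus_dir=$WORKSPACE/spdk/../corpus/llvm_nvmf_${fuzzer_type}
    local nvmf_cfg=/tmp/fuzz_json_${fuzzer_type}.conf

    # Port scheme visible in the trace: "44" plus the zero-padded run
    # number, so run 18 listens on 4418 and run 19 on 4419.
    local port
    port="44$(printf %02d "$fuzzer_type")"
    mkdir -p "$corpus_dir"

    local trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"

    # Point the stock JSON config at this run's service id (default 4420).
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        "$WORKSPACE/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"

    "$WORKSPACE/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" \
        -m "$core" -s 512 \
        -P "$WORKSPACE/spdk/../output/llvm/" \
        -F "$trid" -c "$nvmf_cfg" -t "$timen" \
        -D "$corpus_dir" -Z "$fuzzer_type" \
        -r "/var/tmp/spdk${fuzzer_type}.sock"
}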
00:10:07.291 This may also happen if the target rejected all inputs we tried so far 00:10:07.291 [2024-06-11 12:04:20.099565] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:07.291 [2024-06-11 12:04:20.099608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:07.291 [2024-06-11 12:04:20.099652] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:07.291 [2024-06-11 12:04:20.099672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:07.291 [2024-06-11 12:04:20.099735] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:10:07.291 [2024-06-11 12:04:20.099755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:07.550 NEW_FUNC[1/663]: 0x4bbc80 in fuzz_nvm_write_zeroes_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:562 00:10:07.550 NEW_FUNC[2/663]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:10:07.550 #6 NEW cov: 11523 ft: 11535 corp: 2/68b lim: 100 exec/s: 0 rss: 68Mb L: 67/67 MS: 4 ChangeBinInt-ShuffleBytes-InsertByte-InsertRepeatedBytes- 00:10:07.550 [2024-06-11 12:04:20.570798] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:07.550 [2024-06-11 12:04:20.570860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:07.550 [2024-06-11 12:04:20.570940] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:07.550 [2024-06-11 12:04:20.570968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:07.809 #14 NEW cov: 11650 ft: 12372 corp: 3/122b lim: 100 exec/s: 0 rss: 68Mb L: 54/67 MS: 3 InsertRepeatedBytes-CrossOver-InsertRepeatedBytes- 00:10:07.809 [2024-06-11 12:04:20.620792] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:07.810 [2024-06-11 12:04:20.620830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:07.810 [2024-06-11 12:04:20.620880] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:07.810 [2024-06-11 12:04:20.620900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:07.810 [2024-06-11 12:04:20.620961] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:10:07.810 [2024-06-11 12:04:20.620981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:07.810 #15 NEW cov: 11656 ft: 12569 corp: 4/189b lim: 100 exec/s: 0 rss: 68Mb L: 67/67 MS: 1 CrossOver- 00:10:07.810 [2024-06-11 12:04:20.680896] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:07.810 [2024-06-11 12:04:20.680933] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:07.810 [2024-06-11 12:04:20.680997] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:07.810 [2024-06-11 12:04:20.681016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:07.810 #16 NEW cov: 11741 ft: 12756 corp: 5/238b lim: 100 exec/s: 0 rss: 69Mb L: 49/67 MS: 1 InsertRepeatedBytes- 00:10:07.810 [2024-06-11 12:04:20.731149] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:07.810 [2024-06-11 12:04:20.731186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:07.810 [2024-06-11 12:04:20.731234] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:07.810 [2024-06-11 12:04:20.731253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:07.810 [2024-06-11 12:04:20.731316] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:10:07.810 [2024-06-11 12:04:20.731336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:07.810 #17 NEW cov: 11741 ft: 12814 corp: 6/306b lim: 100 exec/s: 0 rss: 69Mb L: 68/68 MS: 1 InsertByte- 00:10:07.810 [2024-06-11 12:04:20.781063] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:07.810 [2024-06-11 12:04:20.781098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:07.810 #18 NEW cov: 11741 ft: 13220 corp: 7/338b lim: 100 exec/s: 0 rss: 69Mb L: 32/68 MS: 1 InsertRepeatedBytes- 00:10:07.810 [2024-06-11 12:04:20.831268] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:07.810 [2024-06-11 12:04:20.831301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:07.810 [2024-06-11 12:04:20.831346] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:07.810 [2024-06-11 12:04:20.831371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:08.068 #19 NEW cov: 11741 ft: 13264 corp: 8/379b lim: 100 exec/s: 0 rss: 69Mb L: 41/68 MS: 1 EraseBytes- 00:10:08.068 [2024-06-11 12:04:20.891537] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:08.068 [2024-06-11 12:04:20.891572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:08.068 [2024-06-11 12:04:20.891616] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:08.068 [2024-06-11 12:04:20.891641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:08.068 #22 NEW cov: 11741 ft: 13353 corp: 9/437b lim: 100 exec/s: 0 rss: 69Mb L: 58/68 MS: 3 
CrossOver-InsertRepeatedBytes-InsertRepeatedBytes- 00:10:08.068 [2024-06-11 12:04:20.931860] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:08.068 [2024-06-11 12:04:20.931895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:08.068 [2024-06-11 12:04:20.931953] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:08.068 [2024-06-11 12:04:20.931972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:08.068 [2024-06-11 12:04:20.932035] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:10:08.068 [2024-06-11 12:04:20.932054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:08.068 [2024-06-11 12:04:20.932116] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:10:08.068 [2024-06-11 12:04:20.932136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:08.068 #23 NEW cov: 11741 ft: 13665 corp: 10/519b lim: 100 exec/s: 0 rss: 69Mb L: 82/82 MS: 1 CopyPart- 00:10:08.068 [2024-06-11 12:04:20.991631] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:08.068 [2024-06-11 12:04:20.991666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:08.068 NEW_FUNC[1/1]: 0x19807e0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:10:08.068 #24 NEW cov: 11764 ft: 13746 corp: 11/551b lim: 100 exec/s: 0 rss: 69Mb L: 32/82 MS: 1 ChangeBit- 00:10:08.068 [2024-06-11 12:04:21.052062] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:08.068 [2024-06-11 12:04:21.052097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:08.068 [2024-06-11 12:04:21.052151] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:08.068 [2024-06-11 12:04:21.052171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:08.068 [2024-06-11 12:04:21.052233] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:10:08.068 [2024-06-11 12:04:21.052250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:08.068 #25 NEW cov: 11764 ft: 13765 corp: 12/620b lim: 100 exec/s: 25 rss: 69Mb L: 69/82 MS: 1 InsertByte- 00:10:08.327 [2024-06-11 12:04:21.112135] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:08.328 [2024-06-11 12:04:21.112171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:08.328 [2024-06-11 12:04:21.112212] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:08.328 [2024-06-11 12:04:21.112232] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:08.328 #26 NEW cov: 11764 ft: 13814 corp: 13/664b lim: 100 exec/s: 26 rss: 69Mb L: 44/82 MS: 1 CrossOver- 00:10:08.328 [2024-06-11 12:04:21.172322] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:08.328 [2024-06-11 12:04:21.172357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:08.328 [2024-06-11 12:04:21.172411] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:08.328 [2024-06-11 12:04:21.172433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:08.328 #27 NEW cov: 11764 ft: 13838 corp: 14/718b lim: 100 exec/s: 27 rss: 69Mb L: 54/82 MS: 1 ShuffleBytes- 00:10:08.328 [2024-06-11 12:04:21.222572] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:08.328 [2024-06-11 12:04:21.222606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:08.328 [2024-06-11 12:04:21.222648] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:08.328 [2024-06-11 12:04:21.222668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:08.328 [2024-06-11 12:04:21.222729] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:10:08.328 [2024-06-11 12:04:21.222748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:08.328 #28 NEW cov: 11764 ft: 13925 corp: 15/787b lim: 100 exec/s: 28 rss: 70Mb L: 69/82 MS: 1 InsertByte- 00:10:08.328 [2024-06-11 12:04:21.272679] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:08.328 [2024-06-11 12:04:21.272713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:08.328 [2024-06-11 12:04:21.272769] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:08.328 [2024-06-11 12:04:21.272790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:08.328 [2024-06-11 12:04:21.272849] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:10:08.328 [2024-06-11 12:04:21.272869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:08.328 #29 NEW cov: 11764 ft: 14016 corp: 16/857b lim: 100 exec/s: 29 rss: 70Mb L: 70/82 MS: 1 InsertRepeatedBytes- 00:10:08.328 [2024-06-11 12:04:21.332905] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:08.328 [2024-06-11 12:04:21.332939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:08.328 [2024-06-11 12:04:21.332982] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 
00:10:08.328 [2024-06-11 12:04:21.333001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:08.328 [2024-06-11 12:04:21.333063] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:10:08.328 [2024-06-11 12:04:21.333083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:08.587 #30 NEW cov: 11764 ft: 14054 corp: 17/926b lim: 100 exec/s: 30 rss: 70Mb L: 69/82 MS: 1 ChangeBinInt- 00:10:08.587 [2024-06-11 12:04:21.393126] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:08.587 [2024-06-11 12:04:21.393159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:08.587 [2024-06-11 12:04:21.393205] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:08.587 [2024-06-11 12:04:21.393224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:08.587 [2024-06-11 12:04:21.393287] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:10:08.587 [2024-06-11 12:04:21.393306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:08.587 #31 NEW cov: 11764 ft: 14139 corp: 18/993b lim: 100 exec/s: 31 rss: 70Mb L: 67/82 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\001"- 00:10:08.587 [2024-06-11 12:04:21.443234] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:08.587 [2024-06-11 12:04:21.443269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:08.587 [2024-06-11 12:04:21.443323] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:08.587 [2024-06-11 12:04:21.443344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:08.587 [2024-06-11 12:04:21.443410] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:10:08.587 [2024-06-11 12:04:21.443430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:08.587 #32 NEW cov: 11764 ft: 14152 corp: 19/1065b lim: 100 exec/s: 32 rss: 70Mb L: 72/82 MS: 1 InsertRepeatedBytes- 00:10:08.587 [2024-06-11 12:04:21.493504] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:08.587 [2024-06-11 12:04:21.493540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:08.587 [2024-06-11 12:04:21.493599] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:08.587 [2024-06-11 12:04:21.493619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:08.587 [2024-06-11 12:04:21.493680] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:10:08.587 [2024-06-11 
12:04:21.493700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:08.587 [2024-06-11 12:04:21.493764] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:10:08.587 [2024-06-11 12:04:21.493784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:08.587 #34 NEW cov: 11764 ft: 14198 corp: 20/1162b lim: 100 exec/s: 34 rss: 70Mb L: 97/97 MS: 2 EraseBytes-CrossOver- 00:10:08.587 [2024-06-11 12:04:21.543634] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:08.587 [2024-06-11 12:04:21.543668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:08.587 [2024-06-11 12:04:21.543727] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:08.587 [2024-06-11 12:04:21.543747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:08.587 [2024-06-11 12:04:21.543808] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:10:08.587 [2024-06-11 12:04:21.543826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:08.587 [2024-06-11 12:04:21.543886] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:10:08.587 [2024-06-11 12:04:21.543905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:08.587 #35 NEW cov: 11764 ft: 14223 corp: 21/1243b lim: 100 exec/s: 35 rss: 70Mb L: 81/97 MS: 1 CrossOver- 00:10:08.587 [2024-06-11 12:04:21.603718] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:08.587 [2024-06-11 12:04:21.603754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:08.587 [2024-06-11 12:04:21.603806] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:08.587 [2024-06-11 12:04:21.603826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:08.587 [2024-06-11 12:04:21.603890] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:10:08.587 [2024-06-11 12:04:21.603910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:08.847 #36 NEW cov: 11764 ft: 14269 corp: 22/1315b lim: 100 exec/s: 36 rss: 70Mb L: 72/97 MS: 1 ChangeByte- 00:10:08.847 [2024-06-11 12:04:21.663879] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:08.847 [2024-06-11 12:04:21.663915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:08.847 [2024-06-11 12:04:21.663970] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:08.847 [2024-06-11 12:04:21.663990] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:08.847 [2024-06-11 12:04:21.664054] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:10:08.847 [2024-06-11 12:04:21.664073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:08.847 #37 NEW cov: 11764 ft: 14307 corp: 23/1392b lim: 100 exec/s: 37 rss: 70Mb L: 77/97 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\001"- 00:10:08.847 [2024-06-11 12:04:21.713904] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:08.847 [2024-06-11 12:04:21.713938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:08.847 [2024-06-11 12:04:21.713983] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:08.847 [2024-06-11 12:04:21.714003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:08.847 #38 NEW cov: 11764 ft: 14326 corp: 24/1450b lim: 100 exec/s: 38 rss: 70Mb L: 58/97 MS: 1 ChangeByte- 00:10:08.847 [2024-06-11 12:04:21.764163] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:08.847 [2024-06-11 12:04:21.764200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:08.847 [2024-06-11 12:04:21.764250] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:08.847 [2024-06-11 12:04:21.764270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:08.847 [2024-06-11 12:04:21.764332] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:10:08.847 [2024-06-11 12:04:21.764351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:08.847 #39 NEW cov: 11764 ft: 14365 corp: 25/1515b lim: 100 exec/s: 39 rss: 70Mb L: 65/97 MS: 1 CopyPart- 00:10:08.847 [2024-06-11 12:04:21.804260] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:08.847 [2024-06-11 12:04:21.804296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:08.847 [2024-06-11 12:04:21.804349] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:08.847 [2024-06-11 12:04:21.804376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:08.847 [2024-06-11 12:04:21.804438] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:10:08.847 [2024-06-11 12:04:21.804457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:08.847 #40 NEW cov: 11764 ft: 14385 corp: 26/1590b lim: 100 exec/s: 40 rss: 70Mb L: 75/97 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\001"- 00:10:08.847 [2024-06-11 12:04:21.864447] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:08.847 [2024-06-11 12:04:21.864483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:08.847 [2024-06-11 12:04:21.864535] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:08.847 [2024-06-11 12:04:21.864555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:08.847 [2024-06-11 12:04:21.864619] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:10:08.847 [2024-06-11 12:04:21.864637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:09.107 #41 NEW cov: 11764 ft: 14395 corp: 27/1657b lim: 100 exec/s: 41 rss: 70Mb L: 67/97 MS: 1 ChangeByte- 00:10:09.107 [2024-06-11 12:04:21.914612] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:09.107 [2024-06-11 12:04:21.914648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:09.107 [2024-06-11 12:04:21.914691] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:09.107 [2024-06-11 12:04:21.914711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:09.107 [2024-06-11 12:04:21.914770] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:10:09.107 [2024-06-11 12:04:21.914789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:09.107 #42 NEW cov: 11764 ft: 14396 corp: 28/1717b lim: 100 exec/s: 42 rss: 70Mb L: 60/97 MS: 1 InsertRepeatedBytes- 00:10:09.107 [2024-06-11 12:04:21.964756] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:09.107 [2024-06-11 12:04:21.964791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:09.107 [2024-06-11 12:04:21.964845] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:09.107 [2024-06-11 12:04:21.964865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:09.107 [2024-06-11 12:04:21.964929] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:10:09.107 [2024-06-11 12:04:21.964949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:09.107 #43 NEW cov: 11764 ft: 14405 corp: 29/1784b lim: 100 exec/s: 43 rss: 70Mb L: 67/97 MS: 1 ChangeByte- 00:10:09.107 [2024-06-11 12:04:22.004865] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:09.107 [2024-06-11 12:04:22.004901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:09.107 [2024-06-11 12:04:22.004952] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:09.107 [2024-06-11 
12:04:22.004973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:09.107 [2024-06-11 12:04:22.005036] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:10:09.107 [2024-06-11 12:04:22.005056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:09.107 #44 NEW cov: 11764 ft: 14412 corp: 30/1856b lim: 100 exec/s: 44 rss: 70Mb L: 72/97 MS: 1 ChangeBinInt- 00:10:09.107 [2024-06-11 12:04:22.065370] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:10:09.107 [2024-06-11 12:04:22.065409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:09.107 [2024-06-11 12:04:22.065461] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:10:09.107 [2024-06-11 12:04:22.065481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:09.107 [2024-06-11 12:04:22.065546] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:10:09.107 [2024-06-11 12:04:22.065565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:09.107 #45 NEW cov: 11764 ft: 14459 corp: 31/1935b lim: 100 exec/s: 22 rss: 71Mb L: 79/97 MS: 1 CrossOver- 00:10:09.107 #45 DONE cov: 11764 ft: 14459 corp: 31/1935b lim: 100 exec/s: 22 rss: 71Mb 00:10:09.107 ###### Recommended dictionary. ###### 00:10:09.107 "\001\000\000\000\000\000\000\001" # Uses: 2 00:10:09.107 ###### End of recommended dictionary. 
###### 00:10:09.107 Done 45 runs in 2 second(s) 00:10:09.367 12:04:22 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_18.conf 00:10:09.367 12:04:22 -- ../common.sh@72 -- # (( i++ )) 00:10:09.367 12:04:22 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:10:09.367 12:04:22 -- ../common.sh@73 -- # start_llvm_fuzz 19 1 0x1 00:10:09.367 12:04:22 -- nvmf/run.sh@23 -- # local fuzzer_type=19 00:10:09.367 12:04:22 -- nvmf/run.sh@24 -- # local timen=1 00:10:09.367 12:04:22 -- nvmf/run.sh@25 -- # local core=0x1 00:10:09.367 12:04:22 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:10:09.367 12:04:22 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_19.conf 00:10:09.367 12:04:22 -- nvmf/run.sh@29 -- # printf %02d 19 00:10:09.367 12:04:22 -- nvmf/run.sh@29 -- # port=4419 00:10:09.367 12:04:22 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:10:09.367 12:04:22 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' 00:10:09.367 12:04:22 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4419"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:10:09.367 12:04:22 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' -c /tmp/fuzz_json_19.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 -Z 19 -r /var/tmp/spdk19.sock 00:10:09.367 [2024-06-11 12:04:22.286921] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:10:09.367 [2024-06-11 12:04:22.286996] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2697767 ] 00:10:09.367 EAL: No free 2048 kB hugepages reported on node 1 00:10:09.627 [2024-06-11 12:04:22.537948] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:09.627 [2024-06-11 12:04:22.564615] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:10:09.627 [2024-06-11 12:04:22.564790] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:09.627 [2024-06-11 12:04:22.619255] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:10:09.627 [2024-06-11 12:04:22.635498] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4419 *** 00:10:09.627 INFO: Running with entropic power schedule (0xFF, 100). 00:10:09.627 INFO: Seed: 66377258 00:10:09.886 INFO: Loaded 1 modules (341565 inline 8-bit counters): 341565 [0x26b198c, 0x2704fc9), 00:10:09.886 INFO: Loaded 1 PC tables (341565 PCs): 341565 [0x2704fd0,0x2c3b3a0), 00:10:09.886 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:10:09.886 INFO: A corpus is not provided, starting from an empty corpus 00:10:09.886 #2 INITED exec/s: 0 rss: 61Mb 00:10:09.886 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
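Each run closes with the same libFuzzer summary block seen just above: a "#N DONE" line carrying the final coverage (cov:), feature (ft:), corpus-size (corp: units/bytes), throughput (exec/s:) and memory (rss:) counters, a recommended dictionary of byte patterns the mutator found useful, and a "Done N runs" wall-clock line. A small, hypothetical helper (not part of the SPDK tree) for pulling those per-run summaries out of a saved log could look like this:

#!/usr/bin/env bash
# Hypothetical log scraper; the field meanings follow libFuzzer's
# output format, and the path in the usage comment is illustrative.
summarize_fuzz_log() {
    local log=$1
    # Final per-run statistics, e.g.
    # "#45 DONE cov: 11764 ft: 14459 corp: 31/1935b lim: 100 exec/s: 22 rss: 70Mb"
    grep -E '#[0-9]+ DONE cov:' "$log"
    # Wall-clock summaries, e.g. "Done 45 runs in 2 second(s)"
    grep -Eo 'Done [0-9]+ runs in [0-9]+ second\(s\)' "$log"
}

# Usage (illustrative path):
#   summarize_fuzz_log /tmp/llvm_nvmf_run.log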
00:10:09.886 This may also happen if the target rejected all inputs we tried so far 00:10:09.886 [2024-06-11 12:04:22.700870] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:217020519202095875 len:772 00:10:09.886 [2024-06-11 12:04:22.700911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:10.145 NEW_FUNC[1/660]: 0x4bec40 in fuzz_nvm_write_uncorrectable_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:582 00:10:10.145 NEW_FUNC[2/660]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:10:10.145 #10 NEW cov: 11473 ft: 11516 corp: 2/19b lim: 50 exec/s: 0 rss: 68Mb L: 18/18 MS: 3 InsertByte-InsertByte-InsertRepeatedBytes- 00:10:10.145 [2024-06-11 12:04:23.172200] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:217020519202095875 len:772 00:10:10.145 [2024-06-11 12:04:23.172264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:10.404 NEW_FUNC[1/3]: 0x197a180 in event_queue_run_batch /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:528 00:10:10.404 NEW_FUNC[2/3]: 0x197b8d0 in reactor_post_process_lw_thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:864 00:10:10.404 #16 NEW cov: 11628 ft: 12057 corp: 3/37b lim: 50 exec/s: 0 rss: 69Mb L: 18/18 MS: 1 CMP- DE: "\022\000\000\000"- 00:10:10.404 [2024-06-11 12:04:23.232281] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:221239332432970499 len:4 00:10:10.404 [2024-06-11 12:04:23.232320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:10.404 [2024-06-11 12:04:23.232384] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:217020518514230019 len:772 00:10:10.404 [2024-06-11 12:04:23.232407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:10.404 #17 NEW cov: 11634 ft: 12715 corp: 4/59b lim: 50 exec/s: 0 rss: 69Mb L: 22/22 MS: 1 PersAutoDict- DE: "\022\000\000\000"- 00:10:10.404 [2024-06-11 12:04:23.282283] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:217020519202095875 len:772 00:10:10.404 [2024-06-11 12:04:23.282321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:10.404 #18 NEW cov: 11719 ft: 13024 corp: 5/77b lim: 50 exec/s: 0 rss: 69Mb L: 18/22 MS: 1 ChangeBinInt- 00:10:10.404 [2024-06-11 12:04:23.342501] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:217020519202095875 len:772 00:10:10.404 [2024-06-11 12:04:23.342539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:10.404 #19 NEW cov: 11719 ft: 13158 corp: 6/95b lim: 50 exec/s: 0 rss: 69Mb L: 18/22 MS: 1 ShuffleBytes- 00:10:10.404 [2024-06-11 12:04:23.392609] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 
lba:217020519202095875 len:772 00:10:10.404 [2024-06-11 12:04:23.392646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:10.404 #20 NEW cov: 11719 ft: 13201 corp: 7/113b lim: 50 exec/s: 0 rss: 69Mb L: 18/22 MS: 1 ChangeBinInt- 00:10:10.663 [2024-06-11 12:04:23.452920] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:217020519202095875 len:772 00:10:10.663 [2024-06-11 12:04:23.452958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:10.663 [2024-06-11 12:04:23.453018] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3311470318080 len:772 00:10:10.663 [2024-06-11 12:04:23.453040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:10.663 #21 NEW cov: 11719 ft: 13316 corp: 8/135b lim: 50 exec/s: 0 rss: 69Mb L: 22/22 MS: 1 PersAutoDict- DE: "\022\000\000\000"- 00:10:10.663 [2024-06-11 12:04:23.503047] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:217020519202095875 len:772 00:10:10.663 [2024-06-11 12:04:23.503085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:10.663 [2024-06-11 12:04:23.503134] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:50532864 len:772 00:10:10.663 [2024-06-11 12:04:23.503156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:10.663 #22 NEW cov: 11719 ft: 13339 corp: 9/163b lim: 50 exec/s: 0 rss: 69Mb L: 28/28 MS: 1 CopyPart- 00:10:10.663 [2024-06-11 12:04:23.563253] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:217020518514240515 len:772 00:10:10.663 [2024-06-11 12:04:23.563289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:10.663 [2024-06-11 12:04:23.563335] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:217020518514230019 len:779 00:10:10.663 [2024-06-11 12:04:23.563356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:10.663 NEW_FUNC[1/1]: 0x19807e0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:10:10.663 #27 NEW cov: 11742 ft: 13382 corp: 10/185b lim: 50 exec/s: 0 rss: 69Mb L: 22/28 MS: 5 PersAutoDict-CrossOver-EraseBytes-ShuffleBytes-CrossOver- DE: "\022\000\000\000"- 00:10:10.663 [2024-06-11 12:04:23.613389] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:4540476160789916675 len:772 00:10:10.663 [2024-06-11 12:04:23.613426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:10.663 [2024-06-11 12:04:23.613482] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:217020518514230019 len:772 00:10:10.664 [2024-06-11 12:04:23.613505] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:10.664 #28 NEW cov: 11742 ft: 13406 corp: 11/208b lim: 50 exec/s: 0 rss: 69Mb L: 23/28 MS: 1 InsertByte- 00:10:10.664 [2024-06-11 12:04:23.673541] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:217020519202095875 len:65536 00:10:10.664 [2024-06-11 12:04:23.673577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:10.664 [2024-06-11 12:04:23.673624] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:217037002531603203 len:2305 00:10:10.664 [2024-06-11 12:04:23.673646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:10.923 #29 NEW cov: 11742 ft: 13433 corp: 12/229b lim: 50 exec/s: 29 rss: 69Mb L: 21/28 MS: 1 InsertRepeatedBytes- 00:10:10.923 [2024-06-11 12:04:23.733612] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:217020518765888259 len:772 00:10:10.923 [2024-06-11 12:04:23.733648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:10.923 #30 NEW cov: 11742 ft: 13450 corp: 13/247b lim: 50 exec/s: 30 rss: 69Mb L: 18/28 MS: 1 ChangeBinInt- 00:10:10.923 [2024-06-11 12:04:23.783725] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:217020519202095875 len:772 00:10:10.923 [2024-06-11 12:04:23.783764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:10.923 #31 NEW cov: 11742 ft: 13479 corp: 14/265b lim: 50 exec/s: 31 rss: 69Mb L: 18/28 MS: 1 ShuffleBytes- 00:10:10.923 [2024-06-11 12:04:23.833876] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:217020519202095875 len:772 00:10:10.923 [2024-06-11 12:04:23.833912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:10.923 #32 NEW cov: 11742 ft: 13523 corp: 15/283b lim: 50 exec/s: 32 rss: 69Mb L: 18/28 MS: 1 ChangeByte- 00:10:10.923 [2024-06-11 12:04:23.874013] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:217020519202095875 len:772 00:10:10.923 [2024-06-11 12:04:23.874051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:10.923 #33 NEW cov: 11742 ft: 13554 corp: 16/301b lim: 50 exec/s: 33 rss: 70Mb L: 18/28 MS: 1 CMP- DE: "\377\377~u \017\313("- 00:10:10.923 [2024-06-11 12:04:23.934182] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:217020519557104387 len:772 00:10:10.923 [2024-06-11 12:04:23.934219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:11.183 #34 NEW cov: 11742 ft: 13617 corp: 17/320b lim: 50 exec/s: 34 rss: 70Mb L: 19/28 MS: 1 InsertByte- 00:10:11.183 [2024-06-11 12:04:23.974275] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446601640802517763 len:8208 00:10:11.183 [2024-06-11 12:04:23.974312] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:11.183 #35 NEW cov: 11742 ft: 13688 corp: 18/339b lim: 50 exec/s: 35 rss: 70Mb L: 19/28 MS: 1 PersAutoDict- DE: "\377\377~u \017\313("- 00:10:11.183 [2024-06-11 12:04:24.034486] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:217020519202095875 len:772 00:10:11.183 [2024-06-11 12:04:24.034522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:11.183 #36 NEW cov: 11742 ft: 13700 corp: 19/358b lim: 50 exec/s: 36 rss: 70Mb L: 19/28 MS: 1 InsertByte- 00:10:11.183 [2024-06-11 12:04:24.094634] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:217020519202095875 len:772 00:10:11.183 [2024-06-11 12:04:24.094671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:11.183 #37 NEW cov: 11742 ft: 13708 corp: 20/376b lim: 50 exec/s: 37 rss: 70Mb L: 18/28 MS: 1 ChangeBit- 00:10:11.183 [2024-06-11 12:04:24.135014] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:847737088311296 len:772 00:10:11.183 [2024-06-11 12:04:24.135050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:11.183 [2024-06-11 12:04:24.135094] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:221239331745104643 len:1 00:10:11.183 [2024-06-11 12:04:24.135116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:11.183 [2024-06-11 12:04:24.135179] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:217020518463898371 len:779 00:10:11.183 [2024-06-11 12:04:24.135202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:11.183 #38 NEW cov: 11742 ft: 13988 corp: 21/407b lim: 50 exec/s: 38 rss: 70Mb L: 31/31 MS: 1 CopyPart- 00:10:11.183 [2024-06-11 12:04:24.194949] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:217020519202095875 len:3844 00:10:11.183 [2024-06-11 12:04:24.194985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:11.442 #39 NEW cov: 11742 ft: 14036 corp: 22/426b lim: 50 exec/s: 39 rss: 70Mb L: 19/31 MS: 1 InsertByte- 00:10:11.442 [2024-06-11 12:04:24.245079] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:847737088312066 len:772 00:10:11.442 [2024-06-11 12:04:24.245115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:11.442 #40 NEW cov: 11742 ft: 14043 corp: 23/444b lim: 50 exec/s: 40 rss: 70Mb L: 18/31 MS: 1 CMP- DE: "\002\000"- 00:10:11.442 [2024-06-11 12:04:24.305646] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:217020519202095875 len:870 00:10:11.442 [2024-06-11 12:04:24.305683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:11.442 [2024-06-11 12:04:24.305728] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:7306357456645743973 len:25958 00:10:11.442 [2024-06-11 12:04:24.305750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:11.442 [2024-06-11 12:04:24.305811] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:7306357456645743973 len:25958 00:10:11.442 [2024-06-11 12:04:24.305832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:11.442 [2024-06-11 12:04:24.305896] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:7306357035738948965 len:887 00:10:11.442 [2024-06-11 12:04:24.305917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:11.442 #41 NEW cov: 11742 ft: 14300 corp: 24/490b lim: 50 exec/s: 41 rss: 70Mb L: 46/46 MS: 1 InsertRepeatedBytes- 00:10:11.442 [2024-06-11 12:04:24.355393] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:217020519202095875 len:3841 00:10:11.442 [2024-06-11 12:04:24.355429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:11.442 #42 NEW cov: 11742 ft: 14330 corp: 25/502b lim: 50 exec/s: 42 rss: 70Mb L: 12/46 MS: 1 EraseBytes- 00:10:11.442 [2024-06-11 12:04:24.415583] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2594921122453717763 len:772 00:10:11.442 [2024-06-11 12:04:24.415620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:11.442 #43 NEW cov: 11742 ft: 14366 corp: 26/520b lim: 50 exec/s: 43 rss: 70Mb L: 18/46 MS: 1 ChangeByte- 00:10:11.702 [2024-06-11 12:04:24.475866] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:217020518514240515 len:772 00:10:11.702 [2024-06-11 12:04:24.475904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:11.702 [2024-06-11 12:04:24.475949] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:217020518514230019 len:779 00:10:11.702 [2024-06-11 12:04:24.475970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:11.702 #44 NEW cov: 11742 ft: 14384 corp: 27/543b lim: 50 exec/s: 44 rss: 70Mb L: 23/46 MS: 1 InsertByte- 00:10:11.702 [2024-06-11 12:04:24.525902] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3171381874069275395 len:772 00:10:11.702 [2024-06-11 12:04:24.525939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:11.702 #45 NEW cov: 11742 ft: 14392 corp: 28/561b lim: 50 exec/s: 45 rss: 70Mb L: 18/46 MS: 1 ShuffleBytes- 00:10:11.702 [2024-06-11 12:04:24.586050] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 
lba:18229723551689342460 len:64757 00:10:11.702 [2024-06-11 12:04:24.586092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:11.702 #46 NEW cov: 11742 ft: 14408 corp: 29/579b lim: 50 exec/s: 46 rss: 70Mb L: 18/46 MS: 1 ChangeBinInt- 00:10:11.702 [2024-06-11 12:04:24.626298] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:217020518765888259 len:772 00:10:11.702 [2024-06-11 12:04:24.626336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:11.702 [2024-06-11 12:04:24.626399] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:281475027239698 len:1 00:10:11.702 [2024-06-11 12:04:24.626423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:11.702 #47 NEW cov: 11742 ft: 14419 corp: 30/605b lim: 50 exec/s: 47 rss: 70Mb L: 26/46 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\000"- 00:10:11.702 [2024-06-11 12:04:24.686461] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:217020519558491139 len:772 00:10:11.702 [2024-06-11 12:04:24.686498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:11.702 [2024-06-11 12:04:24.686559] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:221239370399810307 len:45 00:10:11.702 [2024-06-11 12:04:24.686581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:11.702 #48 NEW cov: 11742 ft: 14424 corp: 31/634b lim: 50 exec/s: 24 rss: 70Mb L: 29/46 MS: 1 CrossOver- 00:10:11.702 #48 DONE cov: 11742 ft: 14424 corp: 31/634b lim: 50 exec/s: 24 rss: 70Mb 00:10:11.702 ###### Recommended dictionary. ###### 00:10:11.702 "\022\000\000\000" # Uses: 3 00:10:11.702 "\377\377~u \017\313(" # Uses: 1 00:10:11.702 "\002\000" # Uses: 0 00:10:11.702 "\001\000\000\000\000\000\000\000" # Uses: 0 00:10:11.702 ###### End of recommended dictionary. 
###### 00:10:11.702 Done 48 runs in 2 second(s) 00:10:11.961 12:04:24 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_19.conf 00:10:11.961 12:04:24 -- ../common.sh@72 -- # (( i++ )) 00:10:11.961 12:04:24 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:10:11.961 12:04:24 -- ../common.sh@73 -- # start_llvm_fuzz 20 1 0x1 00:10:11.961 12:04:24 -- nvmf/run.sh@23 -- # local fuzzer_type=20 00:10:11.961 12:04:24 -- nvmf/run.sh@24 -- # local timen=1 00:10:11.961 12:04:24 -- nvmf/run.sh@25 -- # local core=0x1 00:10:11.961 12:04:24 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:10:11.961 12:04:24 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_20.conf 00:10:11.961 12:04:24 -- nvmf/run.sh@29 -- # printf %02d 20 00:10:11.961 12:04:24 -- nvmf/run.sh@29 -- # port=4420 00:10:11.961 12:04:24 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:10:11.961 12:04:24 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' 00:10:11.961 12:04:24 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4420"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:10:11.961 12:04:24 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' -c /tmp/fuzz_json_20.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 -Z 20 -r /var/tmp/spdk20.sock 00:10:11.961 [2024-06-11 12:04:24.895297] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:10:11.961 [2024-06-11 12:04:24.895373] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2698134 ] 00:10:11.962 EAL: No free 2048 kB hugepages reported on node 1 00:10:12.276 [2024-06-11 12:04:25.150539] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:12.276 [2024-06-11 12:04:25.176702] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:10:12.276 [2024-06-11 12:04:25.176887] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:12.276 [2024-06-11 12:04:25.231430] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:10:12.276 [2024-06-11 12:04:25.247672] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4420 *** 00:10:12.276 INFO: Running with entropic power schedule (0xFF, 100). 00:10:12.276 INFO: Seed: 2678374220 00:10:12.276 INFO: Loaded 1 modules (341565 inline 8-bit counters): 341565 [0x26b198c, 0x2704fc9), 00:10:12.276 INFO: Loaded 1 PC tables (341565 PCs): 341565 [0x2704fd0,0x2c3b3a0), 00:10:12.276 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:10:12.276 INFO: A corpus is not provided, starting from an empty corpus 00:10:12.276 #2 INITED exec/s: 0 rss: 61Mb 00:10:12.276 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:10:12.276 This may also happen if the target rejected all inputs we tried so far 00:10:12.556 [2024-06-11 12:04:25.303287] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:12.556 [2024-06-11 12:04:25.303330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:12.556 [2024-06-11 12:04:25.303404] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:10:12.556 [2024-06-11 12:04:25.303427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:12.815 NEW_FUNC[1/665]: 0x4c0800 in fuzz_nvm_reservation_acquire_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:597 00:10:12.815 NEW_FUNC[2/665]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:10:12.815 #5 NEW cov: 11571 ft: 11572 corp: 2/40b lim: 90 exec/s: 0 rss: 68Mb L: 39/39 MS: 3 ChangeByte-ChangeBit-InsertRepeatedBytes- 00:10:12.815 [2024-06-11 12:04:25.774334] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:12.815 [2024-06-11 12:04:25.774405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:12.815 #9 NEW cov: 11686 ft: 12907 corp: 3/67b lim: 90 exec/s: 0 rss: 68Mb L: 27/39 MS: 4 ChangeBit-InsertByte-ChangeBinInt-InsertRepeatedBytes- 00:10:12.815 [2024-06-11 12:04:25.834392] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:12.815 [2024-06-11 12:04:25.834431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:13.075 #13 NEW cov: 11692 ft: 13121 corp: 4/97b lim: 90 exec/s: 0 rss: 68Mb L: 30/39 MS: 4 ChangeBit-ChangeByte-ChangeBit-InsertRepeatedBytes- 00:10:13.075 [2024-06-11 12:04:25.884669] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:13.075 [2024-06-11 12:04:25.884707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:13.075 [2024-06-11 12:04:25.884756] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:10:13.075 [2024-06-11 12:04:25.884778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:13.075 #14 NEW cov: 11777 ft: 13356 corp: 5/144b lim: 90 exec/s: 0 rss: 69Mb L: 47/47 MS: 1 CopyPart- 00:10:13.075 [2024-06-11 12:04:25.944693] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:13.075 [2024-06-11 12:04:25.944730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:13.075 #15 NEW cov: 11777 ft: 13406 corp: 6/169b lim: 90 exec/s: 0 rss: 69Mb L: 25/47 MS: 1 CrossOver- 00:10:13.075 [2024-06-11 12:04:25.994863] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:13.075 [2024-06-11 12:04:25.994900] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:13.075 #16 NEW cov: 11777 ft: 13472 corp: 7/203b lim: 90 exec/s: 0 rss: 69Mb L: 34/47 MS: 1 CrossOver- 00:10:13.075 [2024-06-11 12:04:26.054991] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:13.075 [2024-06-11 12:04:26.055029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:13.075 #17 NEW cov: 11777 ft: 13530 corp: 8/234b lim: 90 exec/s: 0 rss: 69Mb L: 31/47 MS: 1 InsertByte- 00:10:13.335 [2024-06-11 12:04:26.115218] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:13.335 [2024-06-11 12:04:26.115255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:13.335 #18 NEW cov: 11777 ft: 13602 corp: 9/265b lim: 90 exec/s: 0 rss: 69Mb L: 31/47 MS: 1 InsertByte- 00:10:13.335 [2024-06-11 12:04:26.165290] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:13.335 [2024-06-11 12:04:26.165328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:13.335 NEW_FUNC[1/1]: 0x19807e0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:10:13.335 #24 NEW cov: 11800 ft: 13647 corp: 10/290b lim: 90 exec/s: 0 rss: 69Mb L: 25/47 MS: 1 CrossOver- 00:10:13.335 [2024-06-11 12:04:26.215655] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:13.335 [2024-06-11 12:04:26.215693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:13.335 [2024-06-11 12:04:26.215757] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:10:13.335 [2024-06-11 12:04:26.215780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:13.335 #25 NEW cov: 11800 ft: 13739 corp: 11/330b lim: 90 exec/s: 0 rss: 69Mb L: 40/47 MS: 1 InsertByte- 00:10:13.335 [2024-06-11 12:04:26.276003] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:13.335 [2024-06-11 12:04:26.276040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:13.335 [2024-06-11 12:04:26.276089] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:10:13.335 [2024-06-11 12:04:26.276111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:13.335 [2024-06-11 12:04:26.276179] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:10:13.335 [2024-06-11 12:04:26.276200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:13.335 #26 NEW cov: 11800 ft: 14068 corp: 12/395b lim: 90 exec/s: 26 rss: 69Mb L: 65/65 MS: 1 CrossOver- 00:10:13.335 [2024-06-11 12:04:26.335948] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:13.335 [2024-06-11 12:04:26.335985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:13.335 [2024-06-11 12:04:26.336048] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:10:13.335 [2024-06-11 12:04:26.336071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:13.595 #27 NEW cov: 11800 ft: 14107 corp: 13/434b lim: 90 exec/s: 27 rss: 69Mb L: 39/65 MS: 1 CrossOver- 00:10:13.595 [2024-06-11 12:04:26.396508] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:13.595 [2024-06-11 12:04:26.396546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:13.595 [2024-06-11 12:04:26.396607] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:10:13.595 [2024-06-11 12:04:26.396629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:13.595 [2024-06-11 12:04:26.396698] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:10:13.595 [2024-06-11 12:04:26.396720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:13.595 [2024-06-11 12:04:26.396789] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:10:13.595 [2024-06-11 12:04:26.396811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:13.595 #28 NEW cov: 11800 ft: 14473 corp: 14/517b lim: 90 exec/s: 28 rss: 69Mb L: 83/83 MS: 1 CopyPart- 00:10:13.595 [2024-06-11 12:04:26.456527] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:13.595 [2024-06-11 12:04:26.456565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:13.595 [2024-06-11 12:04:26.456616] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:10:13.595 [2024-06-11 12:04:26.456639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:13.595 [2024-06-11 12:04:26.456708] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:10:13.595 [2024-06-11 12:04:26.456729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:13.595 #29 NEW cov: 11800 ft: 14487 corp: 15/571b lim: 90 exec/s: 29 rss: 69Mb L: 54/83 MS: 1 CopyPart- 00:10:13.595 [2024-06-11 12:04:26.506350] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:13.595 [2024-06-11 12:04:26.506390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:13.595 #30 NEW cov: 11800 ft: 14501 corp: 16/598b lim: 90 
exec/s: 30 rss: 69Mb L: 27/83 MS: 1 ChangeBit- 00:10:13.595 [2024-06-11 12:04:26.556414] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:13.595 [2024-06-11 12:04:26.556450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:13.595 #31 NEW cov: 11800 ft: 14556 corp: 17/625b lim: 90 exec/s: 31 rss: 70Mb L: 27/83 MS: 1 ChangeByte- 00:10:13.595 [2024-06-11 12:04:26.606603] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:13.595 [2024-06-11 12:04:26.606638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:13.854 #32 NEW cov: 11800 ft: 14574 corp: 18/652b lim: 90 exec/s: 32 rss: 70Mb L: 27/83 MS: 1 ChangeBit- 00:10:13.854 [2024-06-11 12:04:26.666773] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:13.854 [2024-06-11 12:04:26.666809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:13.854 #33 NEW cov: 11800 ft: 14583 corp: 19/682b lim: 90 exec/s: 33 rss: 70Mb L: 30/83 MS: 1 ChangeBit- 00:10:13.854 [2024-06-11 12:04:26.716945] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:13.854 [2024-06-11 12:04:26.716982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:13.854 #34 NEW cov: 11800 ft: 14687 corp: 20/713b lim: 90 exec/s: 34 rss: 70Mb L: 31/83 MS: 1 ChangeByte- 00:10:13.854 [2024-06-11 12:04:26.777260] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:13.854 [2024-06-11 12:04:26.777296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:13.854 [2024-06-11 12:04:26.777349] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:10:13.854 [2024-06-11 12:04:26.777377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:13.854 #35 NEW cov: 11800 ft: 14701 corp: 21/753b lim: 90 exec/s: 35 rss: 70Mb L: 40/83 MS: 1 CopyPart- 00:10:13.854 [2024-06-11 12:04:26.837795] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:13.854 [2024-06-11 12:04:26.837832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:13.854 [2024-06-11 12:04:26.837893] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:10:13.854 [2024-06-11 12:04:26.837915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:13.854 [2024-06-11 12:04:26.837984] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:10:13.854 [2024-06-11 12:04:26.838006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:13.854 [2024-06-11 12:04:26.838073] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:10:13.854 [2024-06-11 12:04:26.838095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:13.854 #36 NEW cov: 11800 ft: 14713 corp: 22/832b lim: 90 exec/s: 36 rss: 70Mb L: 79/83 MS: 1 CopyPart- 00:10:14.114 [2024-06-11 12:04:26.887785] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:14.114 [2024-06-11 12:04:26.887823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:14.114 [2024-06-11 12:04:26.887872] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:10:14.114 [2024-06-11 12:04:26.887894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:14.114 [2024-06-11 12:04:26.887961] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:10:14.114 [2024-06-11 12:04:26.887983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:14.114 #37 NEW cov: 11800 ft: 14729 corp: 23/886b lim: 90 exec/s: 37 rss: 70Mb L: 54/83 MS: 1 ChangeByte- 00:10:14.114 [2024-06-11 12:04:26.947757] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:14.114 [2024-06-11 12:04:26.947793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:14.114 [2024-06-11 12:04:26.947848] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:10:14.114 [2024-06-11 12:04:26.947870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:14.114 #38 NEW cov: 11800 ft: 14761 corp: 24/925b lim: 90 exec/s: 38 rss: 70Mb L: 39/83 MS: 1 ChangeBit- 00:10:14.114 [2024-06-11 12:04:26.998069] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:14.114 [2024-06-11 12:04:26.998106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:14.114 [2024-06-11 12:04:26.998163] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:10:14.114 [2024-06-11 12:04:26.998185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:14.114 [2024-06-11 12:04:26.998251] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:10:14.114 [2024-06-11 12:04:26.998273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:14.114 #39 NEW cov: 11800 ft: 14765 corp: 25/988b lim: 90 exec/s: 39 rss: 70Mb L: 63/83 MS: 1 CopyPart- 00:10:14.114 [2024-06-11 12:04:27.047866] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:14.114 [2024-06-11 12:04:27.047902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:14.114 #40 NEW cov: 11800 ft: 14813 corp: 26/1015b lim: 90 exec/s: 40 rss: 70Mb L: 27/83 MS: 1 CMP- DE: "\377\377"- 00:10:14.114 [2024-06-11 12:04:27.098341] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:14.114 [2024-06-11 12:04:27.098387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:14.114 [2024-06-11 12:04:27.098443] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:10:14.114 [2024-06-11 12:04:27.098465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:14.114 [2024-06-11 12:04:27.098535] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:10:14.114 [2024-06-11 12:04:27.098558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:14.114 #41 NEW cov: 11800 ft: 14825 corp: 27/1078b lim: 90 exec/s: 41 rss: 70Mb L: 63/83 MS: 1 PersAutoDict- DE: "\377\377"- 00:10:14.373 [2024-06-11 12:04:27.158196] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:14.373 [2024-06-11 12:04:27.158232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:14.373 #42 NEW cov: 11800 ft: 14846 corp: 28/1108b lim: 90 exec/s: 42 rss: 70Mb L: 30/83 MS: 1 PersAutoDict- DE: "\377\377"- 00:10:14.373 [2024-06-11 12:04:27.218376] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:14.373 [2024-06-11 12:04:27.218414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:14.373 #43 NEW cov: 11800 ft: 14871 corp: 29/1135b lim: 90 exec/s: 43 rss: 70Mb L: 27/83 MS: 1 ChangeByte- 00:10:14.373 [2024-06-11 12:04:27.278908] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:10:14.373 [2024-06-11 12:04:27.278946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:14.373 [2024-06-11 12:04:27.278995] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:10:14.373 [2024-06-11 12:04:27.279017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:14.373 [2024-06-11 12:04:27.279086] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:10:14.373 [2024-06-11 12:04:27.279108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:14.373 #44 NEW cov: 11800 ft: 14879 corp: 30/1199b lim: 90 exec/s: 22 rss: 70Mb L: 64/83 MS: 1 InsertByte- 00:10:14.373 #44 DONE cov: 11800 ft: 14879 corp: 30/1199b lim: 90 exec/s: 22 rss: 70Mb 00:10:14.373 ###### Recommended dictionary. ###### 00:10:14.373 "\377\377" # Uses: 2 00:10:14.373 ###### End of recommended dictionary. 
###### 00:10:14.374 Done 44 runs in 2 second(s) 00:10:14.633 12:04:27 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_20.conf 00:10:14.633 12:04:27 -- ../common.sh@72 -- # (( i++ )) 00:10:14.633 12:04:27 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:10:14.633 12:04:27 -- ../common.sh@73 -- # start_llvm_fuzz 21 1 0x1 00:10:14.633 12:04:27 -- nvmf/run.sh@23 -- # local fuzzer_type=21 00:10:14.633 12:04:27 -- nvmf/run.sh@24 -- # local timen=1 00:10:14.633 12:04:27 -- nvmf/run.sh@25 -- # local core=0x1 00:10:14.633 12:04:27 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:10:14.633 12:04:27 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_21.conf 00:10:14.634 12:04:27 -- nvmf/run.sh@29 -- # printf %02d 21 00:10:14.634 12:04:27 -- nvmf/run.sh@29 -- # port=4421 00:10:14.634 12:04:27 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:10:14.634 12:04:27 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' 00:10:14.634 12:04:27 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4421"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:10:14.634 12:04:27 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' -c /tmp/fuzz_json_21.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 -Z 21 -r /var/tmp/spdk21.sock 00:10:14.634 [2024-06-11 12:04:27.488446] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:10:14.634 [2024-06-11 12:04:27.488516] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2698452 ] 00:10:14.634 EAL: No free 2048 kB hugepages reported on node 1 00:10:14.893 [2024-06-11 12:04:27.745017] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:14.893 [2024-06-11 12:04:27.771247] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:10:14.893 [2024-06-11 12:04:27.771423] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:14.893 [2024-06-11 12:04:27.825881] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:10:14.893 [2024-06-11 12:04:27.842122] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4421 *** 00:10:14.893 INFO: Running with entropic power schedule (0xFF, 100). 00:10:14.893 INFO: Seed: 976399161 00:10:14.893 INFO: Loaded 1 modules (341565 inline 8-bit counters): 341565 [0x26b198c, 0x2704fc9), 00:10:14.893 INFO: Loaded 1 PC tables (341565 PCs): 341565 [0x2704fd0,0x2c3b3a0), 00:10:14.893 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:10:14.893 INFO: A corpus is not provided, starting from an empty corpus 00:10:14.893 #2 INITED exec/s: 0 rss: 61Mb 00:10:14.893 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:10:14.893 This may also happen if the target rejected all inputs we tried so far 00:10:14.893 [2024-06-11 12:04:27.900634] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:10:14.893 [2024-06-11 12:04:27.900683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:14.893 [2024-06-11 12:04:27.900737] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:10:14.893 [2024-06-11 12:04:27.900764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:15.411 NEW_FUNC[1/665]: 0x4c3a20 in fuzz_nvm_reservation_release_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:623 00:10:15.411 NEW_FUNC[2/665]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:10:15.411 #6 NEW cov: 11545 ft: 11546 corp: 2/27b lim: 50 exec/s: 0 rss: 68Mb L: 26/26 MS: 4 CrossOver-ShuffleBytes-CopyPart-InsertRepeatedBytes- 00:10:15.411 [2024-06-11 12:04:28.261521] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:10:15.411 [2024-06-11 12:04:28.261579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:15.411 [2024-06-11 12:04:28.261637] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:10:15.411 [2024-06-11 12:04:28.261664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:15.411 #7 NEW cov: 11661 ft: 11929 corp: 3/50b lim: 50 exec/s: 0 rss: 69Mb L: 23/26 MS: 1 EraseBytes- 00:10:15.411 [2024-06-11 12:04:28.361641] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:10:15.411 [2024-06-11 12:04:28.361684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:15.411 [2024-06-11 12:04:28.361735] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:10:15.411 [2024-06-11 12:04:28.361762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:15.411 #8 NEW cov: 11667 ft: 12325 corp: 4/77b lim: 50 exec/s: 0 rss: 69Mb L: 27/27 MS: 1 CopyPart- 00:10:15.412 [2024-06-11 12:04:28.431801] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:10:15.412 [2024-06-11 12:04:28.431843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:15.412 [2024-06-11 12:04:28.431893] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:10:15.412 [2024-06-11 12:04:28.431920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:15.670 #9 NEW cov: 11752 ft: 12523 corp: 5/103b lim: 50 exec/s: 0 rss: 69Mb L: 26/27 MS: 1 ChangeBit- 00:10:15.670 [2024-06-11 12:04:28.502232] nvme_qpair.c: 256:nvme_io_qpair_print_command: 
*NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:10:15.670 [2024-06-11 12:04:28.502276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:15.670 [2024-06-11 12:04:28.502326] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:10:15.670 [2024-06-11 12:04:28.502353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:15.670 [2024-06-11 12:04:28.502409] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:10:15.670 [2024-06-11 12:04:28.502433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:15.670 [2024-06-11 12:04:28.502476] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:10:15.670 [2024-06-11 12:04:28.502500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:15.670 #10 NEW cov: 11752 ft: 12961 corp: 6/150b lim: 50 exec/s: 0 rss: 69Mb L: 47/47 MS: 1 InsertRepeatedBytes- 00:10:15.670 [2024-06-11 12:04:28.582224] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:10:15.671 [2024-06-11 12:04:28.582269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:15.671 [2024-06-11 12:04:28.582321] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:10:15.671 [2024-06-11 12:04:28.582348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:15.671 #11 NEW cov: 11752 ft: 13104 corp: 7/177b lim: 50 exec/s: 0 rss: 69Mb L: 27/47 MS: 1 CrossOver- 00:10:15.671 [2024-06-11 12:04:28.672444] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:10:15.671 [2024-06-11 12:04:28.672492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:15.930 #17 NEW cov: 11752 ft: 13919 corp: 8/195b lim: 50 exec/s: 0 rss: 69Mb L: 18/47 MS: 1 EraseBytes- 00:10:15.930 [2024-06-11 12:04:28.772835] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:10:15.930 [2024-06-11 12:04:28.772878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:15.930 [2024-06-11 12:04:28.772930] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:10:15.930 [2024-06-11 12:04:28.772957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:15.930 NEW_FUNC[1/1]: 0x19807e0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:10:15.930 #18 NEW cov: 11769 ft: 13960 corp: 9/221b lim: 50 exec/s: 0 rss: 69Mb L: 26/47 MS: 1 InsertRepeatedBytes- 00:10:15.930 [2024-06-11 12:04:28.843002] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:10:15.930 [2024-06-11 
12:04:28.843044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:15.930 [2024-06-11 12:04:28.843094] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:10:15.930 [2024-06-11 12:04:28.843122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:15.930 #19 NEW cov: 11769 ft: 13991 corp: 10/247b lim: 50 exec/s: 19 rss: 69Mb L: 26/47 MS: 1 ShuffleBytes- 00:10:15.930 [2024-06-11 12:04:28.923379] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:10:15.930 [2024-06-11 12:04:28.923422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:15.930 [2024-06-11 12:04:28.923471] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:10:15.930 [2024-06-11 12:04:28.923498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:15.930 [2024-06-11 12:04:28.923543] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:10:15.930 [2024-06-11 12:04:28.923567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:15.930 [2024-06-11 12:04:28.923610] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:10:15.930 [2024-06-11 12:04:28.923634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:16.190 #20 NEW cov: 11769 ft: 14098 corp: 11/294b lim: 50 exec/s: 20 rss: 69Mb L: 47/47 MS: 1 ChangeBinInt- 00:10:16.190 [2024-06-11 12:04:29.013320] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:10:16.190 [2024-06-11 12:04:29.013369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:16.190 #21 NEW cov: 11769 ft: 14140 corp: 12/310b lim: 50 exec/s: 21 rss: 69Mb L: 16/47 MS: 1 EraseBytes- 00:10:16.190 [2024-06-11 12:04:29.103895] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:10:16.190 [2024-06-11 12:04:29.103937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:16.190 #22 NEW cov: 11769 ft: 14151 corp: 13/328b lim: 50 exec/s: 22 rss: 69Mb L: 18/47 MS: 1 EraseBytes- 00:10:16.190 [2024-06-11 12:04:29.174319] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:10:16.190 [2024-06-11 12:04:29.174372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:16.190 [2024-06-11 12:04:29.174428] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:10:16.190 [2024-06-11 12:04:29.174455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:16.190 [2024-06-11 12:04:29.174501] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:10:16.190 [2024-06-11 12:04:29.174526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:16.190 [2024-06-11 12:04:29.174570] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:10:16.190 [2024-06-11 12:04:29.174594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:16.449 #23 NEW cov: 11769 ft: 14211 corp: 14/369b lim: 50 exec/s: 23 rss: 69Mb L: 41/47 MS: 1 InsertRepeatedBytes- 00:10:16.449 [2024-06-11 12:04:29.264417] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:10:16.449 [2024-06-11 12:04:29.264461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:16.449 [2024-06-11 12:04:29.264515] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:10:16.449 [2024-06-11 12:04:29.264542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:16.449 #24 NEW cov: 11769 ft: 14282 corp: 15/392b lim: 50 exec/s: 24 rss: 69Mb L: 23/47 MS: 1 ChangeByte- 00:10:16.449 [2024-06-11 12:04:29.344830] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:10:16.449 [2024-06-11 12:04:29.344873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:16.449 [2024-06-11 12:04:29.344923] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:10:16.449 [2024-06-11 12:04:29.344949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:16.449 [2024-06-11 12:04:29.344995] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:10:16.449 [2024-06-11 12:04:29.345019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:16.449 [2024-06-11 12:04:29.345063] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:10:16.449 [2024-06-11 12:04:29.345087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:16.450 #25 NEW cov: 11769 ft: 14294 corp: 16/439b lim: 50 exec/s: 25 rss: 69Mb L: 47/47 MS: 1 ChangeBinInt- 00:10:16.450 [2024-06-11 12:04:29.415012] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:10:16.450 [2024-06-11 12:04:29.415053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:16.450 [2024-06-11 12:04:29.415102] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:10:16.450 [2024-06-11 12:04:29.415128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:16.450 [2024-06-11 12:04:29.415175] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:10:16.450 [2024-06-11 12:04:29.415200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:16.450 [2024-06-11 12:04:29.415244] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:10:16.450 [2024-06-11 12:04:29.415273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:16.450 #26 NEW cov: 11769 ft: 14312 corp: 17/486b lim: 50 exec/s: 26 rss: 70Mb L: 47/47 MS: 1 CopyPart- 00:10:16.708 [2024-06-11 12:04:29.485199] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:10:16.708 [2024-06-11 12:04:29.485240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:16.708 [2024-06-11 12:04:29.485288] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:10:16.708 [2024-06-11 12:04:29.485315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:16.708 [2024-06-11 12:04:29.485368] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:10:16.708 [2024-06-11 12:04:29.485393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:16.708 [2024-06-11 12:04:29.485438] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:10:16.708 [2024-06-11 12:04:29.485462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:16.708 #27 NEW cov: 11769 ft: 14320 corp: 18/533b lim: 50 exec/s: 27 rss: 70Mb L: 47/47 MS: 1 ShuffleBytes- 00:10:16.709 [2024-06-11 12:04:29.575269] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:10:16.709 [2024-06-11 12:04:29.575310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:16.709 [2024-06-11 12:04:29.575370] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:10:16.709 [2024-06-11 12:04:29.575398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:16.709 #28 NEW cov: 11769 ft: 14361 corp: 19/559b lim: 50 exec/s: 28 rss: 70Mb L: 26/47 MS: 1 CMP- DE: "3\036\235p.\225\017\000"- 00:10:16.709 [2024-06-11 12:04:29.645648] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:10:16.709 [2024-06-11 12:04:29.645689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:16.709 [2024-06-11 12:04:29.645739] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:10:16.709 [2024-06-11 12:04:29.645765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:16.709 [2024-06-11 
12:04:29.645810] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:10:16.709 [2024-06-11 12:04:29.645835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:16.709 [2024-06-11 12:04:29.645878] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:10:16.709 [2024-06-11 12:04:29.645902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:16.709 #29 NEW cov: 11769 ft: 14451 corp: 20/606b lim: 50 exec/s: 29 rss: 70Mb L: 47/47 MS: 1 PersAutoDict- DE: "3\036\235p.\225\017\000"- 00:10:16.709 [2024-06-11 12:04:29.715600] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:10:16.709 [2024-06-11 12:04:29.715641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:16.709 [2024-06-11 12:04:29.715692] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:10:16.709 [2024-06-11 12:04:29.715719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:16.968 #35 NEW cov: 11776 ft: 14461 corp: 21/633b lim: 50 exec/s: 35 rss: 70Mb L: 27/47 MS: 1 InsertByte- 00:10:16.968 [2024-06-11 12:04:29.805902] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:10:16.968 [2024-06-11 12:04:29.805944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:16.968 [2024-06-11 12:04:29.805995] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:10:16.968 [2024-06-11 12:04:29.806022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:16.968 #36 NEW cov: 11776 ft: 14473 corp: 22/660b lim: 50 exec/s: 36 rss: 70Mb L: 27/47 MS: 1 InsertByte- 00:10:16.968 [2024-06-11 12:04:29.876273] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:10:16.968 [2024-06-11 12:04:29.876314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:16.968 [2024-06-11 12:04:29.876372] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:10:16.968 [2024-06-11 12:04:29.876399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:16.968 [2024-06-11 12:04:29.876446] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:10:16.968 [2024-06-11 12:04:29.876470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:16.968 [2024-06-11 12:04:29.876513] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:10:16.968 [2024-06-11 12:04:29.876537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 
00:10:16.968 #37 NEW cov: 11776 ft: 14502 corp: 23/702b lim: 50 exec/s: 18 rss: 70Mb L: 42/47 MS: 1 CrossOver- 00:10:16.968 #37 DONE cov: 11776 ft: 14502 corp: 23/702b lim: 50 exec/s: 18 rss: 70Mb 00:10:16.968 ###### Recommended dictionary. ###### 00:10:16.968 "3\036\235p.\225\017\000" # Uses: 1 00:10:16.968 ###### End of recommended dictionary. ###### 00:10:16.968 Done 37 runs in 2 second(s) 00:10:17.227 12:04:30 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_21.conf 00:10:17.227 12:04:30 -- ../common.sh@72 -- # (( i++ )) 00:10:17.227 12:04:30 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:10:17.227 12:04:30 -- ../common.sh@73 -- # start_llvm_fuzz 22 1 0x1 00:10:17.227 12:04:30 -- nvmf/run.sh@23 -- # local fuzzer_type=22 00:10:17.227 12:04:30 -- nvmf/run.sh@24 -- # local timen=1 00:10:17.227 12:04:30 -- nvmf/run.sh@25 -- # local core=0x1 00:10:17.227 12:04:30 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:10:17.227 12:04:30 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_22.conf 00:10:17.227 12:04:30 -- nvmf/run.sh@29 -- # printf %02d 22 00:10:17.227 12:04:30 -- nvmf/run.sh@29 -- # port=4422 00:10:17.227 12:04:30 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:10:17.227 12:04:30 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' 00:10:17.227 12:04:30 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4422"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:10:17.227 12:04:30 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' -c /tmp/fuzz_json_22.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 -Z 22 -r /var/tmp/spdk22.sock 00:10:17.227 [2024-06-11 12:04:30.112588] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:10:17.227 [2024-06-11 12:04:30.112658] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2698786 ] 00:10:17.227 EAL: No free 2048 kB hugepages reported on node 1 00:10:17.486 [2024-06-11 12:04:30.357267] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:17.486 [2024-06-11 12:04:30.383495] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:10:17.486 [2024-06-11 12:04:30.383670] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:17.486 [2024-06-11 12:04:30.438174] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:10:17.486 [2024-06-11 12:04:30.454412] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4422 *** 00:10:17.486 INFO: Running with entropic power schedule (0xFF, 100). 
00:10:17.486 INFO: Seed: 3589405213 00:10:17.486 INFO: Loaded 1 modules (341565 inline 8-bit counters): 341565 [0x26b198c, 0x2704fc9), 00:10:17.486 INFO: Loaded 1 PC tables (341565 PCs): 341565 [0x2704fd0,0x2c3b3a0), 00:10:17.486 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:10:17.486 INFO: A corpus is not provided, starting from an empty corpus 00:10:17.486 #2 INITED exec/s: 0 rss: 61Mb 00:10:17.486 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:10:17.486 This may also happen if the target rejected all inputs we tried so far 00:10:17.745 [2024-06-11 12:04:30.520025] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:17.745 [2024-06-11 12:04:30.520065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:17.745 [2024-06-11 12:04:30.520133] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:10:17.745 [2024-06-11 12:04:30.520156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:18.004 NEW_FUNC[1/665]: 0x4c5ce0 in fuzz_nvm_reservation_register_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:644 00:10:18.004 NEW_FUNC[2/665]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:10:18.004 #10 NEW cov: 11574 ft: 11575 corp: 2/37b lim: 85 exec/s: 0 rss: 68Mb L: 36/36 MS: 3 CrossOver-ChangeByte-InsertRepeatedBytes- 00:10:18.004 [2024-06-11 12:04:30.991714] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:18.004 [2024-06-11 12:04:30.991776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:18.004 [2024-06-11 12:04:30.991860] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:10:18.004 [2024-06-11 12:04:30.991890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:18.004 [2024-06-11 12:04:30.991965] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:10:18.004 [2024-06-11 12:04:30.991994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:18.004 [2024-06-11 12:04:30.992074] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:10:18.004 [2024-06-11 12:04:30.992103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:18.262 #16 NEW cov: 11687 ft: 12532 corp: 3/120b lim: 85 exec/s: 0 rss: 69Mb L: 83/83 MS: 1 InsertRepeatedBytes- 00:10:18.262 [2024-06-11 12:04:31.051354] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:18.262 [2024-06-11 12:04:31.051398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:18.262 [2024-06-11 12:04:31.051444] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:10:18.262 [2024-06-11 12:04:31.051470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:18.262 #17 NEW cov: 11693 ft: 12820 corp: 4/156b lim: 85 exec/s: 0 rss: 69Mb L: 36/83 MS: 1 ShuffleBytes- 00:10:18.262 [2024-06-11 12:04:31.101206] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:18.262 [2024-06-11 12:04:31.101242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:18.262 #18 NEW cov: 11778 ft: 13777 corp: 5/185b lim: 85 exec/s: 0 rss: 69Mb L: 29/83 MS: 1 EraseBytes- 00:10:18.262 [2024-06-11 12:04:31.151936] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:18.262 [2024-06-11 12:04:31.151973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:18.262 [2024-06-11 12:04:31.152034] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:10:18.262 [2024-06-11 12:04:31.152056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:18.262 [2024-06-11 12:04:31.152119] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:10:18.262 [2024-06-11 12:04:31.152140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:18.262 [2024-06-11 12:04:31.152207] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:10:18.262 [2024-06-11 12:04:31.152228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:18.262 #19 NEW cov: 11778 ft: 13851 corp: 6/269b lim: 85 exec/s: 0 rss: 69Mb L: 84/84 MS: 1 InsertByte- 00:10:18.262 [2024-06-11 12:04:31.211706] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:18.262 [2024-06-11 12:04:31.211743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:18.262 [2024-06-11 12:04:31.211789] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:10:18.262 [2024-06-11 12:04:31.211810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:18.262 #20 NEW cov: 11778 ft: 13930 corp: 7/306b lim: 85 exec/s: 0 rss: 69Mb L: 37/84 MS: 1 InsertByte- 00:10:18.263 [2024-06-11 12:04:31.271909] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:18.263 [2024-06-11 12:04:31.271947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:18.263 [2024-06-11 12:04:31.271995] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:10:18.263 [2024-06-11 12:04:31.272016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:18.521 #21 NEW cov: 11778 ft: 13991 corp: 8/342b lim: 85 exec/s: 0 rss: 69Mb L: 36/84 MS: 1 ChangeBit- 00:10:18.521 [2024-06-11 12:04:31.322082] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:18.521 [2024-06-11 12:04:31.322121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:18.521 [2024-06-11 12:04:31.322185] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:10:18.521 [2024-06-11 12:04:31.322205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:18.521 #22 NEW cov: 11778 ft: 14088 corp: 9/379b lim: 85 exec/s: 0 rss: 69Mb L: 37/84 MS: 1 ChangeBinInt- 00:10:18.521 [2024-06-11 12:04:31.382570] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:18.521 [2024-06-11 12:04:31.382611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:18.521 [2024-06-11 12:04:31.382658] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:10:18.521 [2024-06-11 12:04:31.382680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:18.521 [2024-06-11 12:04:31.382744] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:10:18.521 [2024-06-11 12:04:31.382765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:18.521 [2024-06-11 12:04:31.382830] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:10:18.521 [2024-06-11 12:04:31.382852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:18.521 NEW_FUNC[1/1]: 0x19807e0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:10:18.521 #23 NEW cov: 11801 ft: 14165 corp: 10/462b lim: 85 exec/s: 0 rss: 69Mb L: 83/84 MS: 1 ChangeByte- 00:10:18.521 [2024-06-11 12:04:31.432334] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:18.521 [2024-06-11 12:04:31.432378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:18.521 [2024-06-11 12:04:31.432427] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:10:18.521 [2024-06-11 12:04:31.432450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:18.521 #24 NEW cov: 11801 ft: 14194 corp: 11/499b lim: 85 exec/s: 0 rss: 69Mb L: 37/84 MS: 1 ChangeBinInt- 00:10:18.521 [2024-06-11 12:04:31.492395] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:18.521 [2024-06-11 12:04:31.492433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:18.521 #28 NEW cov: 11801 
ft: 14237 corp: 12/528b lim: 85 exec/s: 28 rss: 69Mb L: 29/84 MS: 4 ShuffleBytes-ChangeBit-ChangeBit-InsertRepeatedBytes- 00:10:18.521 [2024-06-11 12:04:31.542542] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:18.521 [2024-06-11 12:04:31.542579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:18.780 #29 NEW cov: 11801 ft: 14260 corp: 13/553b lim: 85 exec/s: 29 rss: 69Mb L: 25/84 MS: 1 EraseBytes- 00:10:18.780 [2024-06-11 12:04:31.602687] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:18.780 [2024-06-11 12:04:31.602724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:18.780 #30 NEW cov: 11801 ft: 14278 corp: 14/582b lim: 85 exec/s: 30 rss: 69Mb L: 29/84 MS: 1 ShuffleBytes- 00:10:18.780 [2024-06-11 12:04:31.653011] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:18.780 [2024-06-11 12:04:31.653048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:18.780 [2024-06-11 12:04:31.653111] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:10:18.780 [2024-06-11 12:04:31.653132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:18.780 #31 NEW cov: 11801 ft: 14297 corp: 15/619b lim: 85 exec/s: 31 rss: 69Mb L: 37/84 MS: 1 ChangeBit- 00:10:18.780 [2024-06-11 12:04:31.703304] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:18.780 [2024-06-11 12:04:31.703345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:18.780 [2024-06-11 12:04:31.703401] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:10:18.780 [2024-06-11 12:04:31.703423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:18.780 [2024-06-11 12:04:31.703485] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:10:18.780 [2024-06-11 12:04:31.703507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:18.780 #32 NEW cov: 11801 ft: 14583 corp: 16/683b lim: 85 exec/s: 32 rss: 70Mb L: 64/84 MS: 1 InsertRepeatedBytes- 00:10:18.780 [2024-06-11 12:04:31.763682] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:18.780 [2024-06-11 12:04:31.763719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:18.780 [2024-06-11 12:04:31.763772] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:10:18.780 [2024-06-11 12:04:31.763793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:18.780 [2024-06-11 12:04:31.763856] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:10:18.780 [2024-06-11 12:04:31.763876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:18.780 [2024-06-11 12:04:31.763940] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:10:18.780 [2024-06-11 12:04:31.763962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:18.780 #33 NEW cov: 11801 ft: 14604 corp: 17/765b lim: 85 exec/s: 33 rss: 70Mb L: 82/84 MS: 1 EraseBytes- 00:10:19.039 [2024-06-11 12:04:31.823843] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:19.039 [2024-06-11 12:04:31.823881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:19.039 [2024-06-11 12:04:31.823934] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:10:19.039 [2024-06-11 12:04:31.823955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:19.039 [2024-06-11 12:04:31.824020] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:10:19.039 [2024-06-11 12:04:31.824042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:19.039 [2024-06-11 12:04:31.824108] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:10:19.039 [2024-06-11 12:04:31.824129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:19.039 #34 NEW cov: 11801 ft: 14618 corp: 18/843b lim: 85 exec/s: 34 rss: 70Mb L: 78/84 MS: 1 InsertRepeatedBytes- 00:10:19.039 [2024-06-11 12:04:31.874049] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:19.039 [2024-06-11 12:04:31.874086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:19.039 [2024-06-11 12:04:31.874145] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:10:19.039 [2024-06-11 12:04:31.874167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:19.039 [2024-06-11 12:04:31.874232] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:10:19.039 [2024-06-11 12:04:31.874257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:19.039 [2024-06-11 12:04:31.874322] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:10:19.039 [2024-06-11 12:04:31.874344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:19.039 #35 NEW cov: 11801 ft: 14664 corp: 19/921b lim: 85 exec/s: 35 rss: 70Mb L: 78/84 MS: 1 CopyPart- 00:10:19.039 [2024-06-11 12:04:31.934148] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:19.039 [2024-06-11 12:04:31.934185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:19.039 [2024-06-11 12:04:31.934239] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:10:19.039 [2024-06-11 12:04:31.934261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:19.039 [2024-06-11 12:04:31.934327] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:10:19.039 [2024-06-11 12:04:31.934349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:19.039 [2024-06-11 12:04:31.934418] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:10:19.039 [2024-06-11 12:04:31.934440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:19.039 #36 NEW cov: 11801 ft: 14669 corp: 20/997b lim: 85 exec/s: 36 rss: 70Mb L: 76/84 MS: 1 InsertRepeatedBytes- 00:10:19.039 [2024-06-11 12:04:31.983932] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:19.039 [2024-06-11 12:04:31.983970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:19.039 [2024-06-11 12:04:31.984021] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:10:19.039 [2024-06-11 12:04:31.984043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:19.039 #37 NEW cov: 11801 ft: 14699 corp: 21/1034b lim: 85 exec/s: 37 rss: 70Mb L: 37/84 MS: 1 ChangeBit- 00:10:19.039 [2024-06-11 12:04:32.044484] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:19.039 [2024-06-11 12:04:32.044521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:19.039 [2024-06-11 12:04:32.044582] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:10:19.039 [2024-06-11 12:04:32.044604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:19.039 [2024-06-11 12:04:32.044668] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:10:19.039 [2024-06-11 12:04:32.044690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:19.039 [2024-06-11 12:04:32.044756] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:10:19.039 [2024-06-11 12:04:32.044777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:19.303 #38 NEW cov: 11801 ft: 14736 corp: 22/1112b lim: 85 exec/s: 38 rss: 70Mb L: 78/84 MS: 1 ChangeBinInt- 00:10:19.303 [2024-06-11 
12:04:32.104183] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:19.303 [2024-06-11 12:04:32.104220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:19.303 #39 NEW cov: 11801 ft: 14756 corp: 23/1140b lim: 85 exec/s: 39 rss: 70Mb L: 28/84 MS: 1 CrossOver- 00:10:19.303 [2024-06-11 12:04:32.154262] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:19.303 [2024-06-11 12:04:32.154297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:19.303 #40 NEW cov: 11801 ft: 14808 corp: 24/1169b lim: 85 exec/s: 40 rss: 70Mb L: 29/84 MS: 1 ChangeBinInt- 00:10:19.303 [2024-06-11 12:04:32.214631] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:19.303 [2024-06-11 12:04:32.214668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:19.303 [2024-06-11 12:04:32.214718] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:10:19.303 [2024-06-11 12:04:32.214740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:19.303 #41 NEW cov: 11801 ft: 14814 corp: 25/1216b lim: 85 exec/s: 41 rss: 70Mb L: 47/84 MS: 1 CrossOver- 00:10:19.303 [2024-06-11 12:04:32.274632] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:19.303 [2024-06-11 12:04:32.274668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:19.303 #42 NEW cov: 11801 ft: 14897 corp: 26/1245b lim: 85 exec/s: 42 rss: 70Mb L: 29/84 MS: 1 ChangeBit- 00:10:19.303 [2024-06-11 12:04:32.324882] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:19.303 [2024-06-11 12:04:32.324918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:19.303 [2024-06-11 12:04:32.324973] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:10:19.303 [2024-06-11 12:04:32.324994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:19.567 #43 NEW cov: 11801 ft: 14961 corp: 27/1281b lim: 85 exec/s: 43 rss: 70Mb L: 36/84 MS: 1 ChangeBinInt- 00:10:19.567 [2024-06-11 12:04:32.385096] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:19.567 [2024-06-11 12:04:32.385133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:19.567 [2024-06-11 12:04:32.385195] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:10:19.567 [2024-06-11 12:04:32.385217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:19.567 #44 NEW cov: 11801 ft: 14972 corp: 28/1319b lim: 85 exec/s: 44 rss: 70Mb L: 38/84 MS: 1 EraseBytes- 
00:10:19.567 [2024-06-11 12:04:32.445079] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:19.567 [2024-06-11 12:04:32.445115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:19.567 #45 NEW cov: 11801 ft: 14979 corp: 29/1348b lim: 85 exec/s: 45 rss: 70Mb L: 29/84 MS: 1 ChangeByte- 00:10:19.567 [2024-06-11 12:04:32.495180] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:10:19.567 [2024-06-11 12:04:32.495216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:19.567 #46 NEW cov: 11801 ft: 14995 corp: 30/1377b lim: 85 exec/s: 23 rss: 70Mb L: 29/84 MS: 1 ChangeByte- 00:10:19.567 #46 DONE cov: 11801 ft: 14995 corp: 30/1377b lim: 85 exec/s: 23 rss: 70Mb 00:10:19.567 Done 46 runs in 2 second(s) 00:10:19.826 12:04:32 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_22.conf 00:10:19.826 12:04:32 -- ../common.sh@72 -- # (( i++ )) 00:10:19.826 12:04:32 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:10:19.826 12:04:32 -- ../common.sh@73 -- # start_llvm_fuzz 23 1 0x1 00:10:19.826 12:04:32 -- nvmf/run.sh@23 -- # local fuzzer_type=23 00:10:19.826 12:04:32 -- nvmf/run.sh@24 -- # local timen=1 00:10:19.826 12:04:32 -- nvmf/run.sh@25 -- # local core=0x1 00:10:19.826 12:04:32 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:10:19.826 12:04:32 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_23.conf 00:10:19.826 12:04:32 -- nvmf/run.sh@29 -- # printf %02d 23 00:10:19.826 12:04:32 -- nvmf/run.sh@29 -- # port=4423 00:10:19.826 12:04:32 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:10:19.826 12:04:32 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' 00:10:19.826 12:04:32 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4423"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:10:19.826 12:04:32 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' -c /tmp/fuzz_json_23.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 -Z 23 -r /var/tmp/spdk23.sock 00:10:19.826 [2024-06-11 12:04:32.704258] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 
00:10:19.826 [2024-06-11 12:04:32.704325] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2699092 ] 00:10:19.826 EAL: No free 2048 kB hugepages reported on node 1 00:10:20.085 [2024-06-11 12:04:32.950511] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:20.085 [2024-06-11 12:04:32.976776] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:10:20.085 [2024-06-11 12:04:32.976964] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:20.085 [2024-06-11 12:04:33.031389] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:10:20.085 [2024-06-11 12:04:33.047618] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4423 *** 00:10:20.085 INFO: Running with entropic power schedule (0xFF, 100). 00:10:20.085 INFO: Seed: 1889436936 00:10:20.085 INFO: Loaded 1 modules (341565 inline 8-bit counters): 341565 [0x26b198c, 0x2704fc9), 00:10:20.085 INFO: Loaded 1 PC tables (341565 PCs): 341565 [0x2704fd0,0x2c3b3a0), 00:10:20.085 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:10:20.085 INFO: A corpus is not provided, starting from an empty corpus 00:10:20.085 #2 INITED exec/s: 0 rss: 61Mb 00:10:20.085 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:10:20.085 This may also happen if the target rejected all inputs we tried so far 00:10:20.085 [2024-06-11 12:04:33.102801] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:20.085 [2024-06-11 12:04:33.102847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:20.085 [2024-06-11 12:04:33.102899] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:20.085 [2024-06-11 12:04:33.102927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:20.085 [2024-06-11 12:04:33.102974] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:10:20.085 [2024-06-11 12:04:33.102998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:20.603 NEW_FUNC[1/664]: 0x4c8f10 in fuzz_nvm_reservation_report_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:671 00:10:20.603 NEW_FUNC[2/664]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:10:20.603 #7 NEW cov: 11507 ft: 11507 corp: 2/18b lim: 25 exec/s: 0 rss: 68Mb L: 17/17 MS: 5 ChangeByte-ChangeBinInt-ChangeByte-ShuffleBytes-InsertRepeatedBytes- 00:10:20.603 [2024-06-11 12:04:33.603955] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:20.603 [2024-06-11 12:04:33.604013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:20.603 [2024-06-11 12:04:33.604068] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 
cid:1 nsid:0 00:10:20.603 [2024-06-11 12:04:33.604096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:20.862 #10 NEW cov: 11620 ft: 12233 corp: 3/28b lim: 25 exec/s: 0 rss: 68Mb L: 10/17 MS: 3 CopyPart-ChangeBit-CrossOver- 00:10:20.862 [2024-06-11 12:04:33.684113] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:20.862 [2024-06-11 12:04:33.684160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:20.862 [2024-06-11 12:04:33.684211] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:20.862 [2024-06-11 12:04:33.684238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:20.862 [2024-06-11 12:04:33.684284] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:10:20.862 [2024-06-11 12:04:33.684309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:20.862 #11 NEW cov: 11626 ft: 12475 corp: 4/45b lim: 25 exec/s: 0 rss: 68Mb L: 17/17 MS: 1 CopyPart- 00:10:20.862 [2024-06-11 12:04:33.774323] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:20.862 [2024-06-11 12:04:33.774371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:20.862 [2024-06-11 12:04:33.774421] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:20.862 [2024-06-11 12:04:33.774448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:20.862 [2024-06-11 12:04:33.774495] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:10:20.862 [2024-06-11 12:04:33.774520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:20.862 #12 NEW cov: 11711 ft: 12692 corp: 5/62b lim: 25 exec/s: 0 rss: 68Mb L: 17/17 MS: 1 ChangeByte- 00:10:20.862 [2024-06-11 12:04:33.864558] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:20.862 [2024-06-11 12:04:33.864600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:20.862 [2024-06-11 12:04:33.864649] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:20.863 [2024-06-11 12:04:33.864676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:20.863 [2024-06-11 12:04:33.864723] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:10:20.863 [2024-06-11 12:04:33.864747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:21.121 #13 NEW cov: 11711 ft: 12751 corp: 6/79b lim: 25 exec/s: 0 rss: 69Mb L: 17/17 MS: 1 ShuffleBytes- 00:10:21.121 [2024-06-11 
12:04:33.934806] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:21.121 [2024-06-11 12:04:33.934847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:21.121 [2024-06-11 12:04:33.934903] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:21.121 [2024-06-11 12:04:33.934930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:21.122 [2024-06-11 12:04:33.934977] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:10:21.122 [2024-06-11 12:04:33.935001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:21.122 NEW_FUNC[1/1]: 0x19807e0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:10:21.122 #14 NEW cov: 11734 ft: 12820 corp: 7/96b lim: 25 exec/s: 0 rss: 69Mb L: 17/17 MS: 1 ChangeBit- 00:10:21.122 [2024-06-11 12:04:34.025086] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:21.122 [2024-06-11 12:04:34.025129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:21.122 [2024-06-11 12:04:34.025178] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:21.122 [2024-06-11 12:04:34.025206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:21.122 [2024-06-11 12:04:34.025252] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:10:21.122 [2024-06-11 12:04:34.025276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:21.122 [2024-06-11 12:04:34.025320] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:10:21.122 [2024-06-11 12:04:34.025344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:21.122 #15 NEW cov: 11734 ft: 13382 corp: 8/119b lim: 25 exec/s: 15 rss: 69Mb L: 23/23 MS: 1 CrossOver- 00:10:21.122 [2024-06-11 12:04:34.105261] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:21.122 [2024-06-11 12:04:34.105302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:21.122 [2024-06-11 12:04:34.105353] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:21.122 [2024-06-11 12:04:34.105389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:21.122 [2024-06-11 12:04:34.105436] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:10:21.122 [2024-06-11 12:04:34.105460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:21.381 #17 NEW cov: 11734 
ft: 13408 corp: 9/134b lim: 25 exec/s: 17 rss: 69Mb L: 15/23 MS: 2 CopyPart-InsertRepeatedBytes- 00:10:21.381 [2024-06-11 12:04:34.175376] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:21.381 [2024-06-11 12:04:34.175420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:21.381 [2024-06-11 12:04:34.175471] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:21.381 [2024-06-11 12:04:34.175499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:21.381 #18 NEW cov: 11734 ft: 13476 corp: 10/144b lim: 25 exec/s: 18 rss: 69Mb L: 10/23 MS: 1 EraseBytes- 00:10:21.381 [2024-06-11 12:04:34.255842] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:21.381 [2024-06-11 12:04:34.255884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:21.381 [2024-06-11 12:04:34.255939] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:21.381 [2024-06-11 12:04:34.255966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:21.381 [2024-06-11 12:04:34.256012] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:10:21.381 [2024-06-11 12:04:34.256036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:21.381 [2024-06-11 12:04:34.256080] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:10:21.381 [2024-06-11 12:04:34.256105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:21.381 [2024-06-11 12:04:34.256149] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:10:21.381 [2024-06-11 12:04:34.256174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:10:21.381 #19 NEW cov: 11734 ft: 13549 corp: 11/169b lim: 25 exec/s: 19 rss: 69Mb L: 25/25 MS: 1 CrossOver- 00:10:21.381 [2024-06-11 12:04:34.345869] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:21.381 [2024-06-11 12:04:34.345910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:21.381 [2024-06-11 12:04:34.345959] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:21.381 [2024-06-11 12:04:34.345985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:21.381 [2024-06-11 12:04:34.346032] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:10:21.381 [2024-06-11 12:04:34.346056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:21.641 #20 NEW cov: 11734 
ft: 13655 corp: 12/187b lim: 25 exec/s: 20 rss: 69Mb L: 18/25 MS: 1 InsertByte- 00:10:21.641 [2024-06-11 12:04:34.436179] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:21.641 [2024-06-11 12:04:34.436221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:21.641 [2024-06-11 12:04:34.436270] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:21.641 [2024-06-11 12:04:34.436296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:21.641 [2024-06-11 12:04:34.436343] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:10:21.641 [2024-06-11 12:04:34.436374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:21.641 #21 NEW cov: 11734 ft: 13678 corp: 13/204b lim: 25 exec/s: 21 rss: 69Mb L: 17/25 MS: 1 ChangeBit- 00:10:21.641 [2024-06-11 12:04:34.506353] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:21.641 [2024-06-11 12:04:34.506404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:21.641 [2024-06-11 12:04:34.506454] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:21.641 [2024-06-11 12:04:34.506481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:21.641 [2024-06-11 12:04:34.506527] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:10:21.641 [2024-06-11 12:04:34.506552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:21.641 #22 NEW cov: 11734 ft: 13703 corp: 14/222b lim: 25 exec/s: 22 rss: 69Mb L: 18/25 MS: 1 ChangeBit- 00:10:21.641 [2024-06-11 12:04:34.596757] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:21.641 [2024-06-11 12:04:34.596799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:21.641 [2024-06-11 12:04:34.596847] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:21.641 [2024-06-11 12:04:34.596875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:21.641 [2024-06-11 12:04:34.596921] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:10:21.641 [2024-06-11 12:04:34.596945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:21.641 [2024-06-11 12:04:34.596988] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:10:21.641 [2024-06-11 12:04:34.597012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:21.641 [2024-06-11 12:04:34.597055] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:10:21.641 [2024-06-11 12:04:34.597080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:10:21.900 #23 NEW cov: 11734 ft: 13774 corp: 15/247b lim: 25 exec/s: 23 rss: 69Mb L: 25/25 MS: 1 CrossOver- 00:10:21.900 [2024-06-11 12:04:34.686871] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:21.900 [2024-06-11 12:04:34.686914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:21.900 [2024-06-11 12:04:34.686963] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:21.900 [2024-06-11 12:04:34.686990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:21.900 [2024-06-11 12:04:34.687036] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:10:21.900 [2024-06-11 12:04:34.687061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:21.900 #24 NEW cov: 11734 ft: 13798 corp: 16/265b lim: 25 exec/s: 24 rss: 69Mb L: 18/25 MS: 1 InsertByte- 00:10:21.900 [2024-06-11 12:04:34.757204] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:21.900 [2024-06-11 12:04:34.757247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:21.900 [2024-06-11 12:04:34.757295] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:21.900 [2024-06-11 12:04:34.757322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:21.900 [2024-06-11 12:04:34.757376] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:10:21.901 [2024-06-11 12:04:34.757401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:21.901 [2024-06-11 12:04:34.757445] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:10:21.901 [2024-06-11 12:04:34.757469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:21.901 [2024-06-11 12:04:34.757512] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:10:21.901 [2024-06-11 12:04:34.757536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:10:21.901 #25 NEW cov: 11734 ft: 13800 corp: 17/290b lim: 25 exec/s: 25 rss: 70Mb L: 25/25 MS: 1 ChangeBit- 00:10:21.901 [2024-06-11 12:04:34.847430] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:21.901 [2024-06-11 12:04:34.847472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:21.901 [2024-06-11 12:04:34.847520] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:21.901 [2024-06-11 12:04:34.847548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:21.901 [2024-06-11 12:04:34.847594] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:10:21.901 [2024-06-11 12:04:34.847618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:21.901 [2024-06-11 12:04:34.847661] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:10:21.901 [2024-06-11 12:04:34.847686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:21.901 [2024-06-11 12:04:34.847730] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:10:21.901 [2024-06-11 12:04:34.847754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:10:21.901 #26 NEW cov: 11734 ft: 13819 corp: 18/315b lim: 25 exec/s: 26 rss: 70Mb L: 25/25 MS: 1 ChangeByte- 00:10:21.901 [2024-06-11 12:04:34.917636] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:21.901 [2024-06-11 12:04:34.917678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:21.901 [2024-06-11 12:04:34.917726] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:21.901 [2024-06-11 12:04:34.917754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:21.901 [2024-06-11 12:04:34.917800] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:10:21.901 [2024-06-11 12:04:34.917824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:21.901 [2024-06-11 12:04:34.917867] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:10:21.901 [2024-06-11 12:04:34.917892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:21.901 [2024-06-11 12:04:34.917935] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:10:21.901 [2024-06-11 12:04:34.917959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:10:22.160 #27 NEW cov: 11734 ft: 13853 corp: 19/340b lim: 25 exec/s: 27 rss: 70Mb L: 25/25 MS: 1 ShuffleBytes- 00:10:22.160 [2024-06-11 12:04:34.987697] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:22.160 [2024-06-11 12:04:34.987738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:22.160 [2024-06-11 12:04:34.987786] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:22.160 [2024-06-11 12:04:34.987813] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:22.160 [2024-06-11 12:04:34.987859] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:10:22.160 [2024-06-11 12:04:34.987890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:22.160 [2024-06-11 12:04:34.987934] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:10:22.160 [2024-06-11 12:04:34.987958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:22.160 #28 NEW cov: 11734 ft: 13909 corp: 20/362b lim: 25 exec/s: 28 rss: 70Mb L: 22/25 MS: 1 InsertRepeatedBytes- 00:10:22.160 [2024-06-11 12:04:35.077990] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:10:22.160 [2024-06-11 12:04:35.078033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:22.160 [2024-06-11 12:04:35.078081] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:10:22.160 [2024-06-11 12:04:35.078109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:22.160 [2024-06-11 12:04:35.078155] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:10:22.161 [2024-06-11 12:04:35.078179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:22.161 [2024-06-11 12:04:35.078223] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:10:22.161 [2024-06-11 12:04:35.078247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:22.161 #29 NEW cov: 11734 ft: 13948 corp: 21/385b lim: 25 exec/s: 14 rss: 70Mb L: 23/25 MS: 1 CopyPart- 00:10:22.161 #29 DONE cov: 11734 ft: 13948 corp: 21/385b lim: 25 exec/s: 14 rss: 70Mb 00:10:22.161 Done 29 runs in 2 second(s) 00:10:22.420 12:04:35 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_23.conf 00:10:22.420 12:04:35 -- ../common.sh@72 -- # (( i++ )) 00:10:22.420 12:04:35 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:10:22.420 12:04:35 -- ../common.sh@73 -- # start_llvm_fuzz 24 1 0x1 00:10:22.420 12:04:35 -- nvmf/run.sh@23 -- # local fuzzer_type=24 00:10:22.420 12:04:35 -- nvmf/run.sh@24 -- # local timen=1 00:10:22.420 12:04:35 -- nvmf/run.sh@25 -- # local core=0x1 00:10:22.420 12:04:35 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:10:22.420 12:04:35 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_24.conf 00:10:22.420 12:04:35 -- nvmf/run.sh@29 -- # printf %02d 24 00:10:22.420 12:04:35 -- nvmf/run.sh@29 -- # port=4424 00:10:22.420 12:04:35 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:10:22.420 12:04:35 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' 00:10:22.420 12:04:35 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": 
"4424"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:10:22.420 12:04:35 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' -c /tmp/fuzz_json_24.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 -Z 24 -r /var/tmp/spdk24.sock 00:10:22.420 [2024-06-11 12:04:35.313802] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:10:22.420 [2024-06-11 12:04:35.313873] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2699431 ] 00:10:22.420 EAL: No free 2048 kB hugepages reported on node 1 00:10:22.679 [2024-06-11 12:04:35.579655] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:22.679 [2024-06-11 12:04:35.606380] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:10:22.680 [2024-06-11 12:04:35.606559] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:22.680 [2024-06-11 12:04:35.661017] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:10:22.680 [2024-06-11 12:04:35.677258] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4424 *** 00:10:22.680 INFO: Running with entropic power schedule (0xFF, 100). 00:10:22.680 INFO: Seed: 221470727 00:10:22.939 INFO: Loaded 1 modules (341565 inline 8-bit counters): 341565 [0x26b198c, 0x2704fc9), 00:10:22.939 INFO: Loaded 1 PC tables (341565 PCs): 341565 [0x2704fd0,0x2c3b3a0), 00:10:22.939 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:10:22.939 INFO: A corpus is not provided, starting from an empty corpus 00:10:22.939 #2 INITED exec/s: 0 rss: 61Mb 00:10:22.939 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:10:22.939 This may also happen if the target rejected all inputs we tried so far 00:10:22.939 [2024-06-11 12:04:35.726178] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:22.939 [2024-06-11 12:04:35.726218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:22.939 [2024-06-11 12:04:35.726264] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:22.939 [2024-06-11 12:04:35.726287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:22.939 [2024-06-11 12:04:35.726352] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:22.939 [2024-06-11 12:04:35.726379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:23.198 NEW_FUNC[1/664]: 0x4c9ff0 in fuzz_nvm_compare_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:685 00:10:23.198 NEW_FUNC[2/664]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:10:23.198 #13 NEW cov: 11577 ft: 11578 corp: 2/73b lim: 100 exec/s: 0 rss: 68Mb L: 72/72 MS: 1 InsertRepeatedBytes- 00:10:23.198 [2024-06-11 12:04:36.057069] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13599800760953781436 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.198 [2024-06-11 12:04:36.057118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:23.198 [2024-06-11 12:04:36.057184] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.198 [2024-06-11 12:04:36.057206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:23.198 [2024-06-11 12:04:36.057272] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.198 [2024-06-11 12:04:36.057294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:23.198 NEW_FUNC[1/1]: 0x16eae20 in nvme_qpair_get_state /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/./nvme_internal.h:1437 00:10:23.198 #24 NEW cov: 11692 ft: 12152 corp: 3/145b lim: 100 exec/s: 0 rss: 68Mb L: 72/72 MS: 1 ChangeByte- 00:10:23.198 [2024-06-11 12:04:36.117170] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13599952490408706048 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.198 [2024-06-11 12:04:36.117209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:23.198 [2024-06-11 12:04:36.117270] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 
lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.199 [2024-06-11 12:04:36.117296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:23.199 [2024-06-11 12:04:36.117367] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.199 [2024-06-11 12:04:36.117390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:23.199 #25 NEW cov: 11698 ft: 12292 corp: 4/221b lim: 100 exec/s: 0 rss: 68Mb L: 76/76 MS: 1 CMP- DE: "\001\000\000\000"- 00:10:23.199 [2024-06-11 12:04:36.167243] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13599800760953781436 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.199 [2024-06-11 12:04:36.167281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:23.199 [2024-06-11 12:04:36.167325] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.199 [2024-06-11 12:04:36.167348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:23.199 [2024-06-11 12:04:36.167428] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.199 [2024-06-11 12:04:36.167450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:23.199 #31 NEW cov: 11783 ft: 12652 corp: 5/293b lim: 100 exec/s: 0 rss: 69Mb L: 72/76 MS: 1 CopyPart- 00:10:23.199 [2024-06-11 12:04:36.227687] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13599800760953781436 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.199 [2024-06-11 12:04:36.227724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:23.199 [2024-06-11 12:04:36.227787] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.199 [2024-06-11 12:04:36.227808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:23.199 [2024-06-11 12:04:36.227874] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.199 [2024-06-11 12:04:36.227894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:23.199 [2024-06-11 12:04:36.227960] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:13599952493558414524 len:48139 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.199 [2024-06-11 12:04:36.227980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:23.458 #32 NEW cov: 11783 ft: 13062 corp: 6/374b lim: 
100 exec/s: 0 rss: 69Mb L: 81/81 MS: 1 InsertRepeatedBytes- 00:10:23.458 [2024-06-11 12:04:36.287813] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4846943314785878851 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.459 [2024-06-11 12:04:36.287851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:23.459 [2024-06-11 12:04:36.287902] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.459 [2024-06-11 12:04:36.287923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:23.459 [2024-06-11 12:04:36.287993] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.459 [2024-06-11 12:04:36.288013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:23.459 [2024-06-11 12:04:36.288079] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:13599952493558414524 len:48139 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.459 [2024-06-11 12:04:36.288101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:23.459 #33 NEW cov: 11783 ft: 13127 corp: 7/455b lim: 100 exec/s: 0 rss: 69Mb L: 81/81 MS: 1 ChangeBinInt- 00:10:23.459 [2024-06-11 12:04:36.347976] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13599800760953781436 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.459 [2024-06-11 12:04:36.348013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:23.459 [2024-06-11 12:04:36.348072] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.459 [2024-06-11 12:04:36.348094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:23.459 [2024-06-11 12:04:36.348160] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.459 [2024-06-11 12:04:36.348181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:23.459 [2024-06-11 12:04:36.348249] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.459 [2024-06-11 12:04:36.348270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:23.459 #34 NEW cov: 11783 ft: 13192 corp: 8/538b lim: 100 exec/s: 0 rss: 69Mb L: 83/83 MS: 1 CopyPart- 00:10:23.459 [2024-06-11 12:04:36.398146] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13599800760953781436 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.459 [2024-06-11 12:04:36.398184] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:23.459 [2024-06-11 12:04:36.398255] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.459 [2024-06-11 12:04:36.398277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:23.459 [2024-06-11 12:04:36.398343] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.459 [2024-06-11 12:04:36.398367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:23.459 [2024-06-11 12:04:36.398434] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:13599952493558414524 len:48139 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.459 [2024-06-11 12:04:36.398455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:23.459 #35 NEW cov: 11783 ft: 13247 corp: 9/619b lim: 100 exec/s: 0 rss: 69Mb L: 81/83 MS: 1 ChangeBit- 00:10:23.459 [2024-06-11 12:04:36.447864] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13599952490408706048 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.459 [2024-06-11 12:04:36.447900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:23.459 [2024-06-11 12:04:36.447960] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.459 [2024-06-11 12:04:36.447982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:23.459 #36 NEW cov: 11783 ft: 13644 corp: 10/675b lim: 100 exec/s: 0 rss: 69Mb L: 56/83 MS: 1 EraseBytes- 00:10:23.719 [2024-06-11 12:04:36.508433] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13599800760953781436 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.719 [2024-06-11 12:04:36.508470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:23.719 [2024-06-11 12:04:36.508543] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.719 [2024-06-11 12:04:36.508565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:23.719 [2024-06-11 12:04:36.508628] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.719 [2024-06-11 12:04:36.508650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:23.719 [2024-06-11 12:04:36.508714] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:10:23.719 [2024-06-11 12:04:36.508735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:23.719 #37 NEW cov: 11783 ft: 13672 corp: 11/773b lim: 100 exec/s: 0 rss: 69Mb L: 98/98 MS: 1 CrossOver- 00:10:23.719 [2024-06-11 12:04:36.568642] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13599800760953781436 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.719 [2024-06-11 12:04:36.568680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:23.719 [2024-06-11 12:04:36.568731] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.719 [2024-06-11 12:04:36.568753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:23.719 [2024-06-11 12:04:36.568819] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.719 [2024-06-11 12:04:36.568841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:23.719 [2024-06-11 12:04:36.568908] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.719 [2024-06-11 12:04:36.568930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:23.719 #38 NEW cov: 11783 ft: 13688 corp: 12/854b lim: 100 exec/s: 0 rss: 69Mb L: 81/98 MS: 1 EraseBytes- 00:10:23.719 [2024-06-11 12:04:36.628775] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.719 [2024-06-11 12:04:36.628812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:23.719 [2024-06-11 12:04:36.628869] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.719 [2024-06-11 12:04:36.628891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:23.719 [2024-06-11 12:04:36.628963] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.719 [2024-06-11 12:04:36.628985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:23.719 [2024-06-11 12:04:36.629049] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.719 [2024-06-11 12:04:36.629072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:23.719 #39 NEW cov: 11783 ft: 13711 corp: 13/935b lim: 100 exec/s: 0 rss: 69Mb L: 81/98 MS: 1 CopyPart- 00:10:23.719 [2024-06-11 
12:04:36.678981] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13599800760953781436 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.719 [2024-06-11 12:04:36.679020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:23.719 [2024-06-11 12:04:36.679068] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.719 [2024-06-11 12:04:36.679090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:23.719 [2024-06-11 12:04:36.679155] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.719 [2024-06-11 12:04:36.679178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:23.719 [2024-06-11 12:04:36.679244] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.719 [2024-06-11 12:04:36.679265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:23.719 #40 NEW cov: 11783 ft: 13724 corp: 14/1019b lim: 100 exec/s: 0 rss: 69Mb L: 84/98 MS: 1 InsertByte- 00:10:23.719 [2024-06-11 12:04:36.728916] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13599800760953781436 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.719 [2024-06-11 12:04:36.728954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:23.719 [2024-06-11 12:04:36.729000] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.719 [2024-06-11 12:04:36.729023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:23.719 [2024-06-11 12:04:36.729105] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.719 [2024-06-11 12:04:36.729127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:23.979 #41 NEW cov: 11783 ft: 13745 corp: 15/1091b lim: 100 exec/s: 41 rss: 69Mb L: 72/98 MS: 1 ChangeBit- 00:10:23.979 [2024-06-11 12:04:36.779007] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13599800760953781436 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.979 [2024-06-11 12:04:36.779046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:23.980 [2024-06-11 12:04:36.779090] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.980 [2024-06-11 12:04:36.779116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 
cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:23.980 [2024-06-11 12:04:36.779183] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.980 [2024-06-11 12:04:36.779205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:23.980 #42 NEW cov: 11783 ft: 13761 corp: 16/1163b lim: 100 exec/s: 42 rss: 69Mb L: 72/98 MS: 1 CopyPart- 00:10:23.980 [2024-06-11 12:04:36.829323] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.980 [2024-06-11 12:04:36.829368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:23.980 [2024-06-11 12:04:36.829435] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.980 [2024-06-11 12:04:36.829457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:23.980 [2024-06-11 12:04:36.829524] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.980 [2024-06-11 12:04:36.829545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:23.980 [2024-06-11 12:04:36.829611] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.980 [2024-06-11 12:04:36.829633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:23.980 #43 NEW cov: 11783 ft: 13793 corp: 17/1244b lim: 100 exec/s: 43 rss: 69Mb L: 81/98 MS: 1 ShuffleBytes- 00:10:23.980 [2024-06-11 12:04:36.889348] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13599800760953781436 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.980 [2024-06-11 12:04:36.889392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:23.980 [2024-06-11 12:04:36.889447] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.980 [2024-06-11 12:04:36.889470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:23.980 [2024-06-11 12:04:36.889539] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:53124811261476864 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.980 [2024-06-11 12:04:36.889561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:23.980 #44 NEW cov: 11783 ft: 13801 corp: 18/1316b lim: 100 exec/s: 44 rss: 69Mb L: 72/98 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\000"- 00:10:23.980 [2024-06-11 12:04:36.929743] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 
lba:13599800760953781436 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.980 [2024-06-11 12:04:36.929782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:23.980 [2024-06-11 12:04:36.929834] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.980 [2024-06-11 12:04:36.929857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:23.980 [2024-06-11 12:04:36.929922] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.980 [2024-06-11 12:04:36.929947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:23.980 [2024-06-11 12:04:36.930014] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.980 [2024-06-11 12:04:36.930035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:23.980 #45 NEW cov: 11783 ft: 13827 corp: 19/1415b lim: 100 exec/s: 45 rss: 70Mb L: 99/99 MS: 1 InsertByte- 00:10:23.980 [2024-06-11 12:04:36.989587] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.980 [2024-06-11 12:04:36.989625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:23.980 [2024-06-11 12:04:36.989675] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.980 [2024-06-11 12:04:36.989698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:23.980 [2024-06-11 12:04:36.989776] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:23.980 [2024-06-11 12:04:36.989798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:24.239 #46 NEW cov: 11783 ft: 13854 corp: 20/1487b lim: 100 exec/s: 46 rss: 70Mb L: 72/99 MS: 1 ShuffleBytes- 00:10:24.239 [2024-06-11 12:04:37.029938] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4846943314785878851 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.239 [2024-06-11 12:04:37.029977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:24.239 [2024-06-11 12:04:37.030045] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.239 [2024-06-11 12:04:37.030066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:24.239 [2024-06-11 12:04:37.030132] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.239 [2024-06-11 12:04:37.030153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:24.239 [2024-06-11 12:04:37.030222] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:13599952493558414524 len:48139 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.239 [2024-06-11 12:04:37.030244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:24.239 #47 NEW cov: 11783 ft: 13894 corp: 21/1568b lim: 100 exec/s: 47 rss: 70Mb L: 81/99 MS: 1 ChangeByte- 00:10:24.239 [2024-06-11 12:04:37.090114] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13599800760953781436 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.239 [2024-06-11 12:04:37.090153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:24.239 [2024-06-11 12:04:37.090209] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.239 [2024-06-11 12:04:37.090231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:24.239 [2024-06-11 12:04:37.090298] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.239 [2024-06-11 12:04:37.090324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:24.239 [2024-06-11 12:04:37.090393] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:13546827682296937660 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.239 [2024-06-11 12:04:37.090416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:24.239 #48 NEW cov: 11783 ft: 13901 corp: 22/1666b lim: 100 exec/s: 48 rss: 70Mb L: 98/99 MS: 1 InsertRepeatedBytes- 00:10:24.239 [2024-06-11 12:04:37.140230] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.239 [2024-06-11 12:04:37.140267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:24.239 [2024-06-11 12:04:37.140329] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.239 [2024-06-11 12:04:37.140350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:24.239 [2024-06-11 12:04:37.140423] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.239 [2024-06-11 12:04:37.140445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 
dnr:1 00:10:24.239 [2024-06-11 12:04:37.140510] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.239 [2024-06-11 12:04:37.140532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:24.239 #49 NEW cov: 11783 ft: 13926 corp: 23/1761b lim: 100 exec/s: 49 rss: 70Mb L: 95/99 MS: 1 CrossOver- 00:10:24.239 [2024-06-11 12:04:37.190399] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.239 [2024-06-11 12:04:37.190435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:24.240 [2024-06-11 12:04:37.190511] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.240 [2024-06-11 12:04:37.190532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:24.240 [2024-06-11 12:04:37.190598] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:16777216 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.240 [2024-06-11 12:04:37.190618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:24.240 [2024-06-11 12:04:37.190683] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.240 [2024-06-11 12:04:37.190703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:24.240 #50 NEW cov: 11783 ft: 13940 corp: 24/1850b lim: 100 exec/s: 50 rss: 70Mb L: 89/99 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:10:24.240 [2024-06-11 12:04:37.240256] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13599800760953781436 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.240 [2024-06-11 12:04:37.240293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:24.240 [2024-06-11 12:04:37.240366] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.240 [2024-06-11 12:04:37.240390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:24.499 #51 NEW cov: 11783 ft: 13971 corp: 25/1898b lim: 100 exec/s: 51 rss: 70Mb L: 48/99 MS: 1 EraseBytes- 00:10:24.499 [2024-06-11 12:04:37.300583] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.499 [2024-06-11 12:04:37.300621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:24.499 [2024-06-11 12:04:37.300678] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13599952493558382780 len:48317 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.499 [2024-06-11 12:04:37.300699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:24.499 [2024-06-11 12:04:37.300768] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.499 [2024-06-11 12:04:37.300790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:24.499 #52 NEW cov: 11783 ft: 14005 corp: 26/1970b lim: 100 exec/s: 52 rss: 70Mb L: 72/99 MS: 1 ChangeByte- 00:10:24.499 [2024-06-11 12:04:37.350918] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13599951899913403580 len:46269 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.499 [2024-06-11 12:04:37.350955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:24.499 [2024-06-11 12:04:37.351016] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.499 [2024-06-11 12:04:37.351038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:24.499 [2024-06-11 12:04:37.351106] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.499 [2024-06-11 12:04:37.351126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:24.499 [2024-06-11 12:04:37.351193] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.499 [2024-06-11 12:04:37.351216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:24.499 #53 NEW cov: 11783 ft: 14018 corp: 27/2052b lim: 100 exec/s: 53 rss: 70Mb L: 82/99 MS: 1 InsertByte- 00:10:24.499 [2024-06-11 12:04:37.400507] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13599800760953781436 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.499 [2024-06-11 12:04:37.400544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:24.499 #54 NEW cov: 11783 ft: 14881 corp: 28/2088b lim: 100 exec/s: 54 rss: 70Mb L: 36/99 MS: 1 EraseBytes- 00:10:24.499 [2024-06-11 12:04:37.461038] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13599952493558414386 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.499 [2024-06-11 12:04:37.461075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:24.499 [2024-06-11 12:04:37.461124] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.499 [2024-06-11 12:04:37.461149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 
dnr:1 00:10:24.499 [2024-06-11 12:04:37.461217] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.499 [2024-06-11 12:04:37.461238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:24.499 #59 NEW cov: 11783 ft: 14888 corp: 29/2163b lim: 100 exec/s: 59 rss: 70Mb L: 75/99 MS: 5 CMP-ChangeBinInt-ChangeBinInt-EraseBytes-CrossOver- DE: "\003\000\000\000"- 00:10:24.499 [2024-06-11 12:04:37.511147] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13599800760953781436 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.499 [2024-06-11 12:04:37.511183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:24.499 [2024-06-11 12:04:37.511243] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.500 [2024-06-11 12:04:37.511265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:24.500 [2024-06-11 12:04:37.511331] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.500 [2024-06-11 12:04:37.511353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:24.759 #60 NEW cov: 11783 ft: 14926 corp: 30/2235b lim: 100 exec/s: 60 rss: 70Mb L: 72/99 MS: 1 ChangeBit- 00:10:24.759 [2024-06-11 12:04:37.571343] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13599952493558414386 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.759 [2024-06-11 12:04:37.571385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:24.760 [2024-06-11 12:04:37.571448] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.760 [2024-06-11 12:04:37.571470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:24.760 [2024-06-11 12:04:37.571538] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.760 [2024-06-11 12:04:37.571561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:24.760 #61 NEW cov: 11783 ft: 14962 corp: 31/2310b lim: 100 exec/s: 61 rss: 70Mb L: 75/99 MS: 1 ChangeByte- 00:10:24.760 [2024-06-11 12:04:37.631703] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13599800760953781436 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.760 [2024-06-11 12:04:37.631741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:24.760 [2024-06-11 12:04:37.631815] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 
lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.760 [2024-06-11 12:04:37.631837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:24.760 [2024-06-11 12:04:37.631904] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.760 [2024-06-11 12:04:37.631926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:10:24.760 [2024-06-11 12:04:37.631998] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.760 [2024-06-11 12:04:37.632020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:10:24.760 NEW_FUNC[1/1]: 0x19807e0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:10:24.760 #62 NEW cov: 11806 ft: 15019 corp: 32/2408b lim: 100 exec/s: 62 rss: 70Mb L: 98/99 MS: 1 CopyPart- 00:10:24.760 [2024-06-11 12:04:37.681551] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13599951900852889276 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.760 [2024-06-11 12:04:37.681589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:10:24.760 [2024-06-11 12:04:37.681634] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:10:24.760 [2024-06-11 12:04:37.681654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:10:24.760 #63 NEW cov: 11806 ft: 15026 corp: 33/2457b lim: 100 exec/s: 31 rss: 70Mb L: 49/99 MS: 1 InsertByte- 00:10:24.760 #63 DONE cov: 11806 ft: 15026 corp: 33/2457b lim: 100 exec/s: 31 rss: 70Mb 00:10:24.760 ###### Recommended dictionary. ###### 00:10:24.760 "\001\000\000\000" # Uses: 0 00:10:24.760 "\001\000\000\000\000\000\000\000" # Uses: 1 00:10:24.760 "\003\000\000\000" # Uses: 0 00:10:24.760 ###### End of recommended dictionary. 
###### 00:10:24.760 Done 63 runs in 2 second(s) 00:10:25.020 12:04:37 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_24.conf 00:10:25.020 12:04:37 -- ../common.sh@72 -- # (( i++ )) 00:10:25.020 12:04:37 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:10:25.020 12:04:37 -- nvmf/run.sh@71 -- # trap - SIGINT SIGTERM EXIT 00:10:25.020 00:10:25.020 real 1m5.056s 00:10:25.020 user 1m37.152s 00:10:25.020 sys 0m8.785s 00:10:25.020 12:04:37 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:25.020 12:04:37 -- common/autotest_common.sh@10 -- # set +x 00:10:25.020 ************************************ 00:10:25.020 END TEST nvmf_fuzz 00:10:25.020 ************************************ 00:10:25.020 12:04:37 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:10:25.020 12:04:37 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:10:25.020 12:04:37 -- fuzz/llvm.sh@63 -- # run_test vfio_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:10:25.020 12:04:37 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:10:25.020 12:04:37 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:10:25.020 12:04:37 -- common/autotest_common.sh@10 -- # set +x 00:10:25.020 ************************************ 00:10:25.020 START TEST vfio_fuzz 00:10:25.020 ************************************ 00:10:25.020 12:04:37 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:10:25.020 * Looking for test storage... 00:10:25.020 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:10:25.020 12:04:38 -- vfio/run.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:10:25.020 12:04:38 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:10:25.020 12:04:38 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:10:25.020 12:04:38 -- common/autotest_common.sh@34 -- # set -e 00:10:25.020 12:04:38 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:10:25.020 12:04:38 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:10:25.020 12:04:38 -- common/autotest_common.sh@38 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:10:25.020 12:04:38 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:10:25.020 12:04:38 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:10:25.020 12:04:38 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:10:25.020 12:04:38 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:10:25.020 12:04:38 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:10:25.020 12:04:38 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:10:25.020 12:04:38 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:10:25.020 12:04:38 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:10:25.020 12:04:38 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:10:25.020 12:04:38 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:10:25.020 12:04:38 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:10:25.020 12:04:38 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:10:25.020 12:04:38 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:10:25.020 12:04:38 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:10:25.020 12:04:38 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:10:25.020 12:04:38 -- common/build_config.sh@15 -- # 
CONFIG_RDMA_SEND_WITH_INVAL=y 00:10:25.020 12:04:38 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:10:25.020 12:04:38 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:10:25.020 12:04:38 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:10:25.020 12:04:38 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:10:25.020 12:04:38 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:10:25.020 12:04:38 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:10:25.020 12:04:38 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:10:25.020 12:04:38 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:10:25.020 12:04:38 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:10:25.020 12:04:38 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:10:25.020 12:04:38 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:10:25.020 12:04:38 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:10:25.020 12:04:38 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:10:25.020 12:04:38 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:10:25.020 12:04:38 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:10:25.020 12:04:38 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:10:25.020 12:04:38 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:10:25.020 12:04:38 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:10:25.020 12:04:38 -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB=/usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:10:25.020 12:04:38 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:10:25.020 12:04:38 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:10:25.020 12:04:38 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:10:25.020 12:04:38 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:10:25.020 12:04:38 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:10:25.020 12:04:38 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:10:25.020 12:04:38 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:10:25.020 12:04:38 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:10:25.021 12:04:38 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:10:25.021 12:04:38 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:10:25.021 12:04:38 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:10:25.021 12:04:38 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:10:25.021 12:04:38 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:10:25.021 12:04:38 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:10:25.021 12:04:38 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:10:25.021 12:04:38 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:10:25.021 12:04:38 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:10:25.021 12:04:38 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:10:25.021 12:04:38 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n 00:10:25.021 12:04:38 -- common/build_config.sh@54 -- # CONFIG_WERROR=y 00:10:25.021 12:04:38 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n 00:10:25.021 12:04:38 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y 00:10:25.021 12:04:38 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR= 00:10:25.021 12:04:38 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n 00:10:25.021 12:04:38 -- 
common/build_config.sh@59 -- # CONFIG_ISAL=y 00:10:25.021 12:04:38 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=y 00:10:25.021 12:04:38 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:10:25.021 12:04:38 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs 00:10:25.021 12:04:38 -- common/build_config.sh@63 -- # CONFIG_APPS=y 00:10:25.021 12:04:38 -- common/build_config.sh@64 -- # CONFIG_SHARED=n 00:10:25.021 12:04:38 -- common/build_config.sh@65 -- # CONFIG_FC_PATH= 00:10:25.021 12:04:38 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n 00:10:25.021 12:04:38 -- common/build_config.sh@67 -- # CONFIG_FC=n 00:10:25.021 12:04:38 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n 00:10:25.021 12:04:38 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y 00:10:25.021 12:04:38 -- common/build_config.sh@70 -- # CONFIG_RAID5F=n 00:10:25.021 12:04:38 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y 00:10:25.021 12:04:38 -- common/build_config.sh@72 -- # CONFIG_TESTS=y 00:10:25.021 12:04:38 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n 00:10:25.021 12:04:38 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES= 00:10:25.021 12:04:38 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n 00:10:25.021 12:04:38 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y 00:10:25.021 12:04:38 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n 00:10:25.021 12:04:38 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX= 00:10:25.021 12:04:38 -- common/build_config.sh@79 -- # CONFIG_URING=n 00:10:25.021 12:04:38 -- common/autotest_common.sh@48 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:10:25.021 12:04:38 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:10:25.021 12:04:38 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:10:25.021 12:04:38 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:10:25.021 12:04:38 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:10:25.021 12:04:38 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:10:25.021 12:04:38 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:10:25.021 12:04:38 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:10:25.021 12:04:38 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:10:25.021 12:04:38 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:10:25.021 12:04:38 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:10:25.021 12:04:38 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:10:25.021 12:04:38 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:10:25.021 12:04:38 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:10:25.021 12:04:38 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:10:25.021 12:04:38 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:10:25.021 #define SPDK_CONFIG_H 00:10:25.021 #define SPDK_CONFIG_APPS 1 00:10:25.021 #define SPDK_CONFIG_ARCH native 00:10:25.021 
#undef SPDK_CONFIG_ASAN 00:10:25.021 #undef SPDK_CONFIG_AVAHI 00:10:25.021 #undef SPDK_CONFIG_CET 00:10:25.021 #define SPDK_CONFIG_COVERAGE 1 00:10:25.021 #define SPDK_CONFIG_CROSS_PREFIX 00:10:25.021 #undef SPDK_CONFIG_CRYPTO 00:10:25.021 #undef SPDK_CONFIG_CRYPTO_MLX5 00:10:25.021 #undef SPDK_CONFIG_CUSTOMOCF 00:10:25.021 #undef SPDK_CONFIG_DAOS 00:10:25.021 #define SPDK_CONFIG_DAOS_DIR 00:10:25.021 #define SPDK_CONFIG_DEBUG 1 00:10:25.021 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:10:25.021 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:10:25.021 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:10:25.021 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:10:25.021 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:10:25.021 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:10:25.021 #define SPDK_CONFIG_EXAMPLES 1 00:10:25.021 #undef SPDK_CONFIG_FC 00:10:25.021 #define SPDK_CONFIG_FC_PATH 00:10:25.021 #define SPDK_CONFIG_FIO_PLUGIN 1 00:10:25.021 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:10:25.021 #undef SPDK_CONFIG_FUSE 00:10:25.021 #define SPDK_CONFIG_FUZZER 1 00:10:25.021 #define SPDK_CONFIG_FUZZER_LIB /usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:10:25.021 #undef SPDK_CONFIG_GOLANG 00:10:25.021 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:10:25.021 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:10:25.021 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:10:25.021 #undef SPDK_CONFIG_HAVE_LIBBSD 00:10:25.021 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:10:25.021 #define SPDK_CONFIG_IDXD 1 00:10:25.021 #define SPDK_CONFIG_IDXD_KERNEL 1 00:10:25.021 #undef SPDK_CONFIG_IPSEC_MB 00:10:25.021 #define SPDK_CONFIG_IPSEC_MB_DIR 00:10:25.021 #define SPDK_CONFIG_ISAL 1 00:10:25.021 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:10:25.021 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:10:25.021 #define SPDK_CONFIG_LIBDIR 00:10:25.021 #undef SPDK_CONFIG_LTO 00:10:25.021 #define SPDK_CONFIG_MAX_LCORES 00:10:25.021 #define SPDK_CONFIG_NVME_CUSE 1 00:10:25.021 #undef SPDK_CONFIG_OCF 00:10:25.021 #define SPDK_CONFIG_OCF_PATH 00:10:25.021 #define SPDK_CONFIG_OPENSSL_PATH 00:10:25.021 #undef SPDK_CONFIG_PGO_CAPTURE 00:10:25.021 #undef SPDK_CONFIG_PGO_USE 00:10:25.021 #define SPDK_CONFIG_PREFIX /usr/local 00:10:25.021 #undef SPDK_CONFIG_RAID5F 00:10:25.021 #undef SPDK_CONFIG_RBD 00:10:25.021 #define SPDK_CONFIG_RDMA 1 00:10:25.021 #define SPDK_CONFIG_RDMA_PROV verbs 00:10:25.021 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:10:25.021 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:10:25.021 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:10:25.021 #undef SPDK_CONFIG_SHARED 00:10:25.021 #undef SPDK_CONFIG_SMA 00:10:25.021 #define SPDK_CONFIG_TESTS 1 00:10:25.021 #undef SPDK_CONFIG_TSAN 00:10:25.021 #define SPDK_CONFIG_UBLK 1 00:10:25.021 #define SPDK_CONFIG_UBSAN 1 00:10:25.021 #undef SPDK_CONFIG_UNIT_TESTS 00:10:25.021 #undef SPDK_CONFIG_URING 00:10:25.021 #define SPDK_CONFIG_URING_PATH 00:10:25.021 #undef SPDK_CONFIG_URING_ZNS 00:10:25.021 #undef SPDK_CONFIG_USDT 00:10:25.021 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:10:25.021 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:10:25.021 #define SPDK_CONFIG_VFIO_USER 1 00:10:25.021 #define SPDK_CONFIG_VFIO_USER_DIR 00:10:25.021 #define SPDK_CONFIG_VHOST 1 00:10:25.021 #define SPDK_CONFIG_VIRTIO 1 00:10:25.021 #undef SPDK_CONFIG_VTUNE 00:10:25.021 #define SPDK_CONFIG_VTUNE_DIR 00:10:25.021 #define SPDK_CONFIG_WERROR 1 
00:10:25.021 #define SPDK_CONFIG_WPDK_DIR 00:10:25.021 #undef SPDK_CONFIG_XNVME 00:10:25.021 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:10:25.021 12:04:38 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:10:25.021 12:04:38 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:10:25.021 12:04:38 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:10:25.021 12:04:38 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:10:25.021 12:04:38 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:10:25.021 12:04:38 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:10:25.021 12:04:38 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:10:25.021 12:04:38 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:10:25.021 12:04:38 -- paths/export.sh@5 -- # export PATH 00:10:25.021 12:04:38 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:10:25.022 12:04:38 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:10:25.022 12:04:38 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:10:25.283 12:04:38 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:10:25.283 12:04:38 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:10:25.283 12:04:38 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:10:25.283 12:04:38 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:10:25.283 12:04:38 -- 
pm/common@16 -- # TEST_TAG=N/A 00:10:25.283 12:04:38 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:10:25.283 12:04:38 -- common/autotest_common.sh@52 -- # : 1 00:10:25.283 12:04:38 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY 00:10:25.283 12:04:38 -- common/autotest_common.sh@56 -- # : 0 00:10:25.283 12:04:38 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:10:25.283 12:04:38 -- common/autotest_common.sh@58 -- # : 0 00:10:25.283 12:04:38 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND 00:10:25.283 12:04:38 -- common/autotest_common.sh@60 -- # : 1 00:10:25.283 12:04:38 -- common/autotest_common.sh@61 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:10:25.283 12:04:38 -- common/autotest_common.sh@62 -- # : 0 00:10:25.283 12:04:38 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST 00:10:25.283 12:04:38 -- common/autotest_common.sh@64 -- # : 00:10:25.283 12:04:38 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD 00:10:25.283 12:04:38 -- common/autotest_common.sh@66 -- # : 0 00:10:25.283 12:04:38 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD 00:10:25.283 12:04:38 -- common/autotest_common.sh@68 -- # : 0 00:10:25.283 12:04:38 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL 00:10:25.283 12:04:38 -- common/autotest_common.sh@70 -- # : 0 00:10:25.283 12:04:38 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI 00:10:25.283 12:04:38 -- common/autotest_common.sh@72 -- # : 0 00:10:25.283 12:04:38 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR 00:10:25.283 12:04:38 -- common/autotest_common.sh@74 -- # : 0 00:10:25.283 12:04:38 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME 00:10:25.283 12:04:38 -- common/autotest_common.sh@76 -- # : 0 00:10:25.283 12:04:38 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR 00:10:25.283 12:04:38 -- common/autotest_common.sh@78 -- # : 0 00:10:25.283 12:04:38 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP 00:10:25.283 12:04:38 -- common/autotest_common.sh@80 -- # : 0 00:10:25.283 12:04:38 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI 00:10:25.283 12:04:38 -- common/autotest_common.sh@82 -- # : 0 00:10:25.283 12:04:38 -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE 00:10:25.283 12:04:38 -- common/autotest_common.sh@84 -- # : 0 00:10:25.283 12:04:38 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP 00:10:25.283 12:04:38 -- common/autotest_common.sh@86 -- # : 0 00:10:25.283 12:04:38 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF 00:10:25.283 12:04:38 -- common/autotest_common.sh@88 -- # : 0 00:10:25.283 12:04:38 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER 00:10:25.283 12:04:38 -- common/autotest_common.sh@90 -- # : 0 00:10:25.283 12:04:38 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU 00:10:25.283 12:04:38 -- common/autotest_common.sh@92 -- # : 1 00:10:25.283 12:04:38 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER 00:10:25.283 12:04:38 -- common/autotest_common.sh@94 -- # : 1 00:10:25.283 12:04:38 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT 00:10:25.283 12:04:38 -- common/autotest_common.sh@96 -- # : rdma 00:10:25.283 12:04:38 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT 00:10:25.283 12:04:38 -- common/autotest_common.sh@98 -- # : 0 00:10:25.283 12:04:38 -- common/autotest_common.sh@99 -- # 
export SPDK_TEST_RBD 00:10:25.283 12:04:38 -- common/autotest_common.sh@100 -- # : 0 00:10:25.283 12:04:38 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST 00:10:25.283 12:04:38 -- common/autotest_common.sh@102 -- # : 0 00:10:25.283 12:04:38 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV 00:10:25.283 12:04:38 -- common/autotest_common.sh@104 -- # : 0 00:10:25.283 12:04:38 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT 00:10:25.283 12:04:38 -- common/autotest_common.sh@106 -- # : 0 00:10:25.283 12:04:38 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS 00:10:25.283 12:04:38 -- common/autotest_common.sh@108 -- # : 0 00:10:25.283 12:04:38 -- common/autotest_common.sh@109 -- # export SPDK_TEST_VHOST_INIT 00:10:25.283 12:04:38 -- common/autotest_common.sh@110 -- # : 0 00:10:25.283 12:04:38 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL 00:10:25.283 12:04:38 -- common/autotest_common.sh@112 -- # : 0 00:10:25.283 12:04:38 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS 00:10:25.283 12:04:38 -- common/autotest_common.sh@114 -- # : 0 00:10:25.283 12:04:38 -- common/autotest_common.sh@115 -- # export SPDK_RUN_ASAN 00:10:25.283 12:04:38 -- common/autotest_common.sh@116 -- # : 1 00:10:25.283 12:04:38 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN 00:10:25.283 12:04:38 -- common/autotest_common.sh@118 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:10:25.283 12:04:38 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK 00:10:25.283 12:04:38 -- common/autotest_common.sh@120 -- # : 0 00:10:25.283 12:04:38 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT 00:10:25.283 12:04:38 -- common/autotest_common.sh@122 -- # : 0 00:10:25.283 12:04:38 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO 00:10:25.283 12:04:38 -- common/autotest_common.sh@124 -- # : 0 00:10:25.283 12:04:38 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL 00:10:25.283 12:04:38 -- common/autotest_common.sh@126 -- # : 0 00:10:25.283 12:04:38 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF 00:10:25.283 12:04:38 -- common/autotest_common.sh@128 -- # : 0 00:10:25.283 12:04:38 -- common/autotest_common.sh@129 -- # export SPDK_TEST_VMD 00:10:25.283 12:04:38 -- common/autotest_common.sh@130 -- # : 0 00:10:25.283 12:04:38 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL 00:10:25.283 12:04:38 -- common/autotest_common.sh@132 -- # : v23.11 00:10:25.283 12:04:38 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK 00:10:25.283 12:04:38 -- common/autotest_common.sh@134 -- # : true 00:10:25.283 12:04:38 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X 00:10:25.283 12:04:38 -- common/autotest_common.sh@136 -- # : 0 00:10:25.283 12:04:38 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5 00:10:25.283 12:04:38 -- common/autotest_common.sh@138 -- # : 0 00:10:25.283 12:04:38 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING 00:10:25.283 12:04:38 -- common/autotest_common.sh@140 -- # : 0 00:10:25.283 12:04:38 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT 00:10:25.283 12:04:38 -- common/autotest_common.sh@142 -- # : 0 00:10:25.283 12:04:38 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO 00:10:25.283 12:04:38 -- common/autotest_common.sh@144 -- # : 0 00:10:25.283 12:04:38 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER 00:10:25.283 12:04:38 -- common/autotest_common.sh@146 -- 
# : 0 00:10:25.283 12:04:38 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD 00:10:25.283 12:04:38 -- common/autotest_common.sh@148 -- # : 00:10:25.283 12:04:38 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS 00:10:25.283 12:04:38 -- common/autotest_common.sh@150 -- # : 0 00:10:25.283 12:04:38 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA 00:10:25.283 12:04:38 -- common/autotest_common.sh@152 -- # : 0 00:10:25.283 12:04:38 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS 00:10:25.283 12:04:38 -- common/autotest_common.sh@154 -- # : 0 00:10:25.283 12:04:38 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME 00:10:25.283 12:04:38 -- common/autotest_common.sh@156 -- # : 0 00:10:25.283 12:04:38 -- common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA 00:10:25.283 12:04:38 -- common/autotest_common.sh@158 -- # : 0 00:10:25.283 12:04:38 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA 00:10:25.283 12:04:38 -- common/autotest_common.sh@160 -- # : 0 00:10:25.283 12:04:38 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT 00:10:25.283 12:04:38 -- common/autotest_common.sh@163 -- # : 00:10:25.283 12:04:38 -- common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET 00:10:25.283 12:04:38 -- common/autotest_common.sh@165 -- # : 0 00:10:25.283 12:04:38 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS 00:10:25.283 12:04:38 -- common/autotest_common.sh@167 -- # : 0 00:10:25.283 12:04:38 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT 00:10:25.283 12:04:38 -- common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:10:25.283 12:04:38 -- common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:10:25.283 12:04:38 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:10:25.283 12:04:38 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:10:25.283 12:04:38 -- common/autotest_common.sh@173 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:10:25.283 12:04:38 -- common/autotest_common.sh@173 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:10:25.284 12:04:38 -- common/autotest_common.sh@174 -- # export 
LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:10:25.284 12:04:38 -- common/autotest_common.sh@174 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:10:25.284 12:04:38 -- common/autotest_common.sh@177 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:10:25.284 12:04:38 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:10:25.284 12:04:38 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:10:25.284 12:04:38 -- common/autotest_common.sh@181 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:10:25.284 12:04:38 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1 00:10:25.284 12:04:38 -- common/autotest_common.sh@185 -- 
# PYTHONDONTWRITEBYTECODE=1 00:10:25.284 12:04:38 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:10:25.284 12:04:38 -- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:10:25.284 12:04:38 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:10:25.284 12:04:38 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:10:25.284 12:04:38 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:10:25.284 12:04:38 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file 00:10:25.284 12:04:38 -- common/autotest_common.sh@196 -- # cat 00:10:25.284 12:04:38 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so 00:10:25.284 12:04:38 -- common/autotest_common.sh@224 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:10:25.284 12:04:38 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:10:25.284 12:04:38 -- common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:10:25.284 12:04:38 -- common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:10:25.284 12:04:38 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']' 00:10:25.284 12:04:38 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR 00:10:25.284 12:04:38 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:10:25.284 12:04:38 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:10:25.284 12:04:38 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:10:25.284 12:04:38 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:10:25.284 12:04:38 -- common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:10:25.284 12:04:38 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:10:25.284 12:04:38 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:10:25.284 12:04:38 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:10:25.284 12:04:38 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:10:25.284 12:04:38 -- common/autotest_common.sh@242 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:10:25.284 12:04:38 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:10:25.284 12:04:38 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:10:25.284 12:04:38 -- common/autotest_common.sh@248 -- # '[' 0 -eq 0 ']' 00:10:25.284 12:04:38 -- common/autotest_common.sh@249 -- # export valgrind= 00:10:25.284 12:04:38 -- common/autotest_common.sh@249 -- # valgrind= 00:10:25.284 12:04:38 -- common/autotest_common.sh@255 -- # uname -s 00:10:25.284 12:04:38 -- 
common/autotest_common.sh@255 -- # '[' Linux = Linux ']' 00:10:25.284 12:04:38 -- common/autotest_common.sh@256 -- # HUGEMEM=4096 00:10:25.284 12:04:38 -- common/autotest_common.sh@257 -- # export CLEAR_HUGE=yes 00:10:25.284 12:04:38 -- common/autotest_common.sh@257 -- # CLEAR_HUGE=yes 00:10:25.284 12:04:38 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:10:25.284 12:04:38 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:10:25.284 12:04:38 -- common/autotest_common.sh@265 -- # MAKE=make 00:10:25.284 12:04:38 -- common/autotest_common.sh@266 -- # MAKEFLAGS=-j72 00:10:25.284 12:04:38 -- common/autotest_common.sh@282 -- # export HUGEMEM=4096 00:10:25.284 12:04:38 -- common/autotest_common.sh@282 -- # HUGEMEM=4096 00:10:25.284 12:04:38 -- common/autotest_common.sh@284 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:10:25.284 12:04:38 -- common/autotest_common.sh@289 -- # NO_HUGE=() 00:10:25.284 12:04:38 -- common/autotest_common.sh@290 -- # TEST_MODE= 00:10:25.284 12:04:38 -- common/autotest_common.sh@309 -- # [[ -z 2699821 ]] 00:10:25.284 12:04:38 -- common/autotest_common.sh@309 -- # kill -0 2699821 00:10:25.284 12:04:38 -- common/autotest_common.sh@1665 -- # set_test_storage 2147483648 00:10:25.284 12:04:38 -- common/autotest_common.sh@319 -- # [[ -v testdir ]] 00:10:25.284 12:04:38 -- common/autotest_common.sh@321 -- # local requested_size=2147483648 00:10:25.284 12:04:38 -- common/autotest_common.sh@322 -- # local mount target_dir 00:10:25.284 12:04:38 -- common/autotest_common.sh@324 -- # local -A mounts fss sizes avails uses 00:10:25.284 12:04:38 -- common/autotest_common.sh@325 -- # local source fs size avail mount use 00:10:25.284 12:04:38 -- common/autotest_common.sh@327 -- # local storage_fallback storage_candidates 00:10:25.284 12:04:38 -- common/autotest_common.sh@329 -- # mktemp -udt spdk.XXXXXX 00:10:25.284 12:04:38 -- common/autotest_common.sh@329 -- # storage_fallback=/tmp/spdk.4JeOuX 00:10:25.284 12:04:38 -- common/autotest_common.sh@334 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:10:25.284 12:04:38 -- common/autotest_common.sh@336 -- # [[ -n '' ]] 00:10:25.284 12:04:38 -- common/autotest_common.sh@341 -- # [[ -n '' ]] 00:10:25.284 12:04:38 -- common/autotest_common.sh@346 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio /tmp/spdk.4JeOuX/tests/vfio /tmp/spdk.4JeOuX 00:10:25.284 12:04:38 -- common/autotest_common.sh@349 -- # requested_size=2214592512 00:10:25.284 12:04:38 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:10:25.284 12:04:38 -- common/autotest_common.sh@318 -- # df -T 00:10:25.284 12:04:38 -- common/autotest_common.sh@318 -- # grep -v Filesystem 00:10:25.284 12:04:38 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_devtmpfs 00:10:25.284 12:04:38 -- common/autotest_common.sh@352 -- # fss["$mount"]=devtmpfs 00:10:25.284 12:04:38 -- common/autotest_common.sh@353 -- # avails["$mount"]=67108864 00:10:25.284 12:04:38 -- common/autotest_common.sh@353 -- # sizes["$mount"]=67108864 00:10:25.284 12:04:38 -- common/autotest_common.sh@354 -- # uses["$mount"]=0 00:10:25.284 12:04:38 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:10:25.284 12:04:38 -- common/autotest_common.sh@352 -- # mounts["$mount"]=/dev/pmem0 00:10:25.284 12:04:38 -- common/autotest_common.sh@352 -- # fss["$mount"]=ext2 00:10:25.284 12:04:38 -- common/autotest_common.sh@353 -- # 
avails["$mount"]=902909952 00:10:25.284 12:04:38 -- common/autotest_common.sh@353 -- # sizes["$mount"]=5284429824 00:10:25.284 12:04:38 -- common/autotest_common.sh@354 -- # uses["$mount"]=4381519872 00:10:25.284 12:04:38 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:10:25.284 12:04:38 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_root 00:10:25.284 12:04:38 -- common/autotest_common.sh@352 -- # fss["$mount"]=overlay 00:10:25.284 12:04:38 -- common/autotest_common.sh@353 -- # avails["$mount"]=80792150016 00:10:25.284 12:04:38 -- common/autotest_common.sh@353 -- # sizes["$mount"]=94508556288 00:10:25.284 12:04:38 -- common/autotest_common.sh@354 -- # uses["$mount"]=13716406272 00:10:25.284 12:04:38 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:10:25.284 12:04:38 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:10:25.284 12:04:38 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:10:25.284 12:04:38 -- common/autotest_common.sh@353 -- # avails["$mount"]=47251685376 00:10:25.284 12:04:38 -- common/autotest_common.sh@353 -- # sizes["$mount"]=47254278144 00:10:25.284 12:04:38 -- common/autotest_common.sh@354 -- # uses["$mount"]=2592768 00:10:25.284 12:04:38 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:10:25.284 12:04:38 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:10:25.284 12:04:38 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:10:25.284 12:04:38 -- common/autotest_common.sh@353 -- # avails["$mount"]=18895622144 00:10:25.284 12:04:38 -- common/autotest_common.sh@353 -- # sizes["$mount"]=18901712896 00:10:25.284 12:04:38 -- common/autotest_common.sh@354 -- # uses["$mount"]=6090752 00:10:25.284 12:04:38 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:10:25.284 12:04:38 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:10:25.284 12:04:38 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:10:25.284 12:04:38 -- common/autotest_common.sh@353 -- # avails["$mount"]=47253745664 00:10:25.284 12:04:38 -- common/autotest_common.sh@353 -- # sizes["$mount"]=47254278144 00:10:25.284 12:04:38 -- common/autotest_common.sh@354 -- # uses["$mount"]=532480 00:10:25.284 12:04:38 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:10:25.284 12:04:38 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:10:25.284 12:04:38 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:10:25.284 12:04:38 -- common/autotest_common.sh@353 -- # avails["$mount"]=9450848256 00:10:25.284 12:04:38 -- common/autotest_common.sh@353 -- # sizes["$mount"]=9450852352 00:10:25.284 12:04:38 -- common/autotest_common.sh@354 -- # uses["$mount"]=4096 00:10:25.284 12:04:38 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:10:25.284 12:04:38 -- common/autotest_common.sh@357 -- # printf '* Looking for test storage...\n' 00:10:25.285 * Looking for test storage... 
00:10:25.285 12:04:38 -- common/autotest_common.sh@359 -- # local target_space new_size 00:10:25.285 12:04:38 -- common/autotest_common.sh@360 -- # for target_dir in "${storage_candidates[@]}" 00:10:25.285 12:04:38 -- common/autotest_common.sh@363 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:10:25.285 12:04:38 -- common/autotest_common.sh@363 -- # awk '$1 !~ /Filesystem/{print $6}' 00:10:25.285 12:04:38 -- common/autotest_common.sh@363 -- # mount=/ 00:10:25.285 12:04:38 -- common/autotest_common.sh@365 -- # target_space=80792150016 00:10:25.285 12:04:38 -- common/autotest_common.sh@366 -- # (( target_space == 0 || target_space < requested_size )) 00:10:25.285 12:04:38 -- common/autotest_common.sh@369 -- # (( target_space >= requested_size )) 00:10:25.285 12:04:38 -- common/autotest_common.sh@371 -- # [[ overlay == tmpfs ]] 00:10:25.285 12:04:38 -- common/autotest_common.sh@371 -- # [[ overlay == ramfs ]] 00:10:25.285 12:04:38 -- common/autotest_common.sh@371 -- # [[ / == / ]] 00:10:25.285 12:04:38 -- common/autotest_common.sh@372 -- # new_size=15930998784 00:10:25.285 12:04:38 -- common/autotest_common.sh@373 -- # (( new_size * 100 / sizes[/] > 95 )) 00:10:25.285 12:04:38 -- common/autotest_common.sh@378 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:10:25.285 12:04:38 -- common/autotest_common.sh@378 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:10:25.285 12:04:38 -- common/autotest_common.sh@379 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:10:25.285 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:10:25.285 12:04:38 -- common/autotest_common.sh@380 -- # return 0 00:10:25.285 12:04:38 -- common/autotest_common.sh@1667 -- # set -o errtrace 00:10:25.285 12:04:38 -- common/autotest_common.sh@1668 -- # shopt -s extdebug 00:10:25.285 12:04:38 -- common/autotest_common.sh@1669 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:10:25.285 12:04:38 -- common/autotest_common.sh@1671 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:10:25.285 12:04:38 -- common/autotest_common.sh@1672 -- # true 00:10:25.285 12:04:38 -- common/autotest_common.sh@1674 -- # xtrace_fd 00:10:25.285 12:04:38 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:10:25.285 12:04:38 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:10:25.285 12:04:38 -- common/autotest_common.sh@27 -- # exec 00:10:25.285 12:04:38 -- common/autotest_common.sh@29 -- # exec 00:10:25.285 12:04:38 -- common/autotest_common.sh@31 -- # xtrace_restore 00:10:25.285 12:04:38 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:10:25.285 12:04:38 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:10:25.285 12:04:38 -- common/autotest_common.sh@18 -- # set -x 00:10:25.285 12:04:38 -- vfio/run.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/../common.sh 00:10:25.285 12:04:38 -- ../common.sh@8 -- # pids=() 00:10:25.285 12:04:38 -- vfio/run.sh@58 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:10:25.285 12:04:38 -- vfio/run.sh@59 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:10:25.285 12:04:38 -- vfio/run.sh@59 -- # fuzz_num=7 00:10:25.285 12:04:38 -- vfio/run.sh@60 -- # (( fuzz_num != 0 )) 00:10:25.285 12:04:38 -- vfio/run.sh@62 -- # trap 'cleanup /tmp/vfio-user-*; exit 1' SIGINT SIGTERM EXIT 00:10:25.285 12:04:38 -- vfio/run.sh@65 -- # mem_size=0 00:10:25.285 12:04:38 -- vfio/run.sh@66 -- # [[ 1 -eq 1 ]] 00:10:25.285 12:04:38 -- vfio/run.sh@67 -- # start_llvm_fuzz_short 7 1 00:10:25.285 12:04:38 -- ../common.sh@69 -- # local fuzz_num=7 00:10:25.285 12:04:38 -- ../common.sh@70 -- # local time=1 00:10:25.285 12:04:38 -- ../common.sh@72 -- # (( i = 0 )) 00:10:25.285 12:04:38 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:10:25.285 12:04:38 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:10:25.285 12:04:38 -- vfio/run.sh@22 -- # local fuzzer_type=0 00:10:25.285 12:04:38 -- vfio/run.sh@23 -- # local timen=1 00:10:25.285 12:04:38 -- vfio/run.sh@24 -- # local core=0x1 00:10:25.285 12:04:38 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:10:25.285 12:04:38 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-0 00:10:25.285 12:04:38 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-0/domain/1 00:10:25.285 12:04:38 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-0/domain/2 00:10:25.285 12:04:38 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-0/fuzz_vfio_json.conf 00:10:25.285 12:04:38 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-0 /tmp/vfio-user-0/domain/1 /tmp/vfio-user-0/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:10:25.285 12:04:38 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-0/domain/1%; 00:10:25.285 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-0/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:10:25.285 12:04:38 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-0/domain/1 -c /tmp/vfio-user-0/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 -Y /tmp/vfio-user-0/domain/2 -r /tmp/vfio-user-0/spdk0.sock -Z 0 00:10:25.285 [2024-06-11 12:04:38.221138] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 
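The run.sh trace just above contains the full recipe each of the seven fuzzer instances follows: create a private /tmp/vfio-user-N tree plus corpus directory, rewrite the template JSON config to point at that instance's sockets, and launch llvm_vfio_fuzz. The condensed sketch below copies the flags verbatim from the traced command line; the loop, the WS variable, the args array, and the per-flag comments are illustrative glue and best-effort readings of the trace, not the actual start_llvm_fuzz helper.

    WS=/var/jenkins/workspace/short-fuzz-phy-autotest    # workspace root from the trace
    for N in 0 1 2 3 4 5 6; do                           # fuzz_num=7, counted from '.fn =' above
        d=/tmp/vfio-user-$N
        mkdir -p "$d/domain/1" "$d/domain/2" "$WS/spdk/../corpus/llvm_vfio_$N"
        # Point the template config at this instance's private vfio-user sockets;
        # the output redirection is inferred from vfiouser_cfg in the trace.
        sed -e "s%/tmp/vfio-user/domain/1%$d/domain/1%; s%/tmp/vfio-user/domain/2%$d/domain/2%" \
            "$WS/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf" > "$d/fuzz_vfio_json.conf"
        args=(
            -m 0x1 -s 0                           # core mask / mem size (core=0x1, mem_size=0)
            -P "$WS/spdk/../output/llvm/"         # LLVM output location
            -F "$d/domain/1"                      # vfiouser_dir
            -c "$d/fuzz_vfio_json.conf"           # vfiouser_cfg
            -t 1                                  # timen=1, the short-run budget
            -D "$WS/spdk/../corpus/llvm_vfio_$N"  # corpus_dir
            -Y "$d/domain/2"                      # vfiouser_io_dir
            -r "$d/spdk$N.sock"                   # per-instance RPC socket
            -Z "$N"                               # fuzzer_type: which .fn target to run
        )
        "$WS/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz" "${args[@]}"
        rm -rf "$d"                               # cleanup, as vfio/run.sh@49 does
    done

With a one-second budget per target this mirrors start_llvm_fuzz_short 7 1 from the trace: seven back-to-back short runs, each reporting its own seed, coverage counters, and "Done N runs" summary, as the output below shows.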
00:10:25.285 [2024-06-11 12:04:38.221235] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2699946 ] 00:10:25.285 EAL: No free 2048 kB hugepages reported on node 1 00:10:25.544 [2024-06-11 12:04:38.353328] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:25.544 [2024-06-11 12:04:38.403357] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:10:25.544 [2024-06-11 12:04:38.403584] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:25.803 INFO: Running with entropic power schedule (0xFF, 100). 00:10:25.803 INFO: Seed: 3144476163 00:10:25.803 INFO: Loaded 1 modules (338807 inline 8-bit counters): 338807 [0x267420c, 0x26c6d83), 00:10:25.803 INFO: Loaded 1 PC tables (338807 PCs): 338807 [0x26c6d88,0x2bf24f8), 00:10:25.803 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:10:25.803 INFO: A corpus is not provided, starting from an empty corpus 00:10:25.803 #2 INITED exec/s: 0 rss: 61Mb 00:10:25.803 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:10:25.803 This may also happen if the target rejected all inputs we tried so far 00:10:26.320 NEW_FUNC[1/618]: 0x49e0e0 in fuzz_vfio_user_region_rw /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:85 00:10:26.320 NEW_FUNC[2/618]: 0x4a3c80 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:10:26.320 #13 NEW cov: 10641 ft: 10656 corp: 2/40b lim: 60 exec/s: 0 rss: 68Mb L: 39/39 MS: 1 InsertRepeatedBytes- 00:10:26.579 NEW_FUNC[1/4]: 0x10f0140 in nvmf_ns_reservation_request_check /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3996 00:10:26.579 NEW_FUNC[2/4]: 0x10f5ad0 in spdk_nvmf_request_using_zcopy /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/nvmf_transport.h:502 00:10:26.579 #14 NEW cov: 10738 ft: 14242 corp: 3/71b lim: 60 exec/s: 0 rss: 69Mb L: 31/39 MS: 1 EraseBytes- 00:10:26.838 #15 NEW cov: 10738 ft: 14783 corp: 4/102b lim: 60 exec/s: 15 rss: 70Mb L: 31/39 MS: 1 CrossOver- 00:10:27.097 #17 NEW cov: 10748 ft: 15497 corp: 5/153b lim: 60 exec/s: 17 rss: 70Mb L: 51/51 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:10:27.097 #18 NEW cov: 10748 ft: 15579 corp: 6/211b lim: 60 exec/s: 18 rss: 70Mb L: 58/58 MS: 1 InsertRepeatedBytes- 00:10:27.355 #19 NEW cov: 10748 ft: 15741 corp: 7/249b lim: 60 exec/s: 19 rss: 70Mb L: 38/58 MS: 1 EraseBytes- 00:10:27.614 #20 NEW cov: 10755 ft: 15811 corp: 8/307b lim: 60 exec/s: 20 rss: 70Mb L: 58/58 MS: 1 ChangeByte- 00:10:27.873 #26 NEW cov: 10755 ft: 16000 corp: 9/345b lim: 60 exec/s: 13 rss: 70Mb L: 38/58 MS: 1 ShuffleBytes- 00:10:27.873 #26 DONE cov: 10755 ft: 16000 corp: 9/345b lim: 60 exec/s: 13 rss: 70Mb 00:10:27.873 Done 26 runs in 2 second(s) 00:10:28.133 12:04:41 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-0 00:10:28.133 12:04:41 -- ../common.sh@72 -- # (( i++ )) 00:10:28.133 12:04:41 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:10:28.133 12:04:41 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:10:28.133 12:04:41 -- vfio/run.sh@22 -- # local fuzzer_type=1 00:10:28.133 12:04:41 -- vfio/run.sh@23 -- # local timen=1 00:10:28.133 12:04:41 -- vfio/run.sh@24 -- # local core=0x1 00:10:28.133 12:04:41 -- 
vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:10:28.133 12:04:41 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-1 00:10:28.133 12:04:41 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-1/domain/1 00:10:28.133 12:04:41 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-1/domain/2 00:10:28.133 12:04:41 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-1/fuzz_vfio_json.conf 00:10:28.133 12:04:41 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-1 /tmp/vfio-user-1/domain/1 /tmp/vfio-user-1/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:10:28.133 12:04:41 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-1/domain/1%; 00:10:28.133 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-1/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:10:28.133 12:04:41 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-1/domain/1 -c /tmp/vfio-user-1/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 -Y /tmp/vfio-user-1/domain/2 -r /tmp/vfio-user-1/spdk1.sock -Z 1 00:10:28.393 [2024-06-11 12:04:41.176312] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:10:28.393 [2024-06-11 12:04:41.176422] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2700388 ] 00:10:28.393 EAL: No free 2048 kB hugepages reported on node 1 00:10:28.393 [2024-06-11 12:04:41.306754] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:28.394 [2024-06-11 12:04:41.352695] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:10:28.394 [2024-06-11 12:04:41.352904] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:28.653 INFO: Running with entropic power schedule (0xFF, 100). 00:10:28.653 INFO: Seed: 1781512633 00:10:28.653 INFO: Loaded 1 modules (338807 inline 8-bit counters): 338807 [0x267420c, 0x26c6d83), 00:10:28.653 INFO: Loaded 1 PC tables (338807 PCs): 338807 [0x26c6d88,0x2bf24f8), 00:10:28.653 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:10:28.653 INFO: A corpus is not provided, starting from an empty corpus 00:10:28.653 #2 INITED exec/s: 0 rss: 61Mb 00:10:28.653 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:10:28.653 This may also happen if the target rejected all inputs we tried so far 00:10:28.653 [2024-06-11 12:04:41.649449] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:10:28.653 [2024-06-11 12:04:41.649491] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:10:28.653 [2024-06-11 12:04:41.649524] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:10:29.481 NEW_FUNC[1/628]: 0x49e680 in fuzz_vfio_user_version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:72 00:10:29.481 NEW_FUNC[2/628]: 0x4a3c80 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:10:29.481 #5 NEW cov: 10722 ft: 10691 corp: 2/15b lim: 40 exec/s: 0 rss: 68Mb L: 14/14 MS: 3 CopyPart-ChangeByte-InsertRepeatedBytes- 00:10:29.481 [2024-06-11 12:04:42.260327] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:10:29.481 [2024-06-11 12:04:42.260386] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:10:29.481 [2024-06-11 12:04:42.260420] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:10:29.481 NEW_FUNC[1/1]: 0x194cf80 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:10:29.481 #6 NEW cov: 10753 ft: 13312 corp: 3/21b lim: 40 exec/s: 0 rss: 69Mb L: 6/14 MS: 1 InsertRepeatedBytes- 00:10:29.481 [2024-06-11 12:04:42.444197] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:10:29.481 [2024-06-11 12:04:42.444237] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:10:29.481 [2024-06-11 12:04:42.444265] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:10:29.740 #7 NEW cov: 10753 ft: 14630 corp: 4/27b lim: 40 exec/s: 0 rss: 70Mb L: 6/14 MS: 1 ChangeByte- 00:10:29.740 [2024-06-11 12:04:42.606452] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:10:29.740 [2024-06-11 12:04:42.606489] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:10:29.740 [2024-06-11 12:04:42.606518] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:10:29.740 #8 NEW cov: 10753 ft: 15112 corp: 5/32b lim: 40 exec/s: 8 rss: 70Mb L: 5/14 MS: 1 EraseBytes- 00:10:29.740 [2024-06-11 12:04:42.758764] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:10:29.740 [2024-06-11 12:04:42.758800] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:10:29.740 [2024-06-11 12:04:42.758829] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:10:29.999 #9 NEW cov: 10753 ft: 15388 corp: 6/38b lim: 40 exec/s: 9 rss: 70Mb L: 6/14 MS: 1 ChangeBit- 00:10:29.999 [2024-06-11 12:04:42.911116] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:10:29.999 [2024-06-11 12:04:42.911151] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:10:29.999 [2024-06-11 12:04:42.911180] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:10:29.999 #10 NEW cov: 10753 ft: 15763 corp: 7/42b lim: 40 exec/s: 10 rss: 70Mb L: 4/14 MS: 1 EraseBytes- 00:10:30.258 [2024-06-11 12:04:43.073250] 
vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:10:30.258 [2024-06-11 12:04:43.073286] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:10:30.258 [2024-06-11 12:04:43.073314] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:10:30.258 #11 NEW cov: 10753 ft: 15860 corp: 8/81b lim: 40 exec/s: 11 rss: 70Mb L: 39/39 MS: 1 InsertRepeatedBytes- 00:10:30.258 [2024-06-11 12:04:43.235464] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:10:30.258 [2024-06-11 12:04:43.235502] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:10:30.258 [2024-06-11 12:04:43.235530] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:10:30.517 #12 NEW cov: 10753 ft: 16397 corp: 9/93b lim: 40 exec/s: 12 rss: 70Mb L: 12/39 MS: 1 CopyPart- 00:10:30.517 [2024-06-11 12:04:43.397861] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:10:30.517 [2024-06-11 12:04:43.397897] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:10:30.517 [2024-06-11 12:04:43.397924] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:10:30.517 #14 NEW cov: 10760 ft: 16502 corp: 10/127b lim: 40 exec/s: 14 rss: 70Mb L: 34/39 MS: 2 CopyPart-InsertRepeatedBytes- 00:10:30.777 [2024-06-11 12:04:43.550213] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:10:30.777 [2024-06-11 12:04:43.550249] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:10:30.777 [2024-06-11 12:04:43.550278] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:10:30.777 #15 NEW cov: 10760 ft: 16641 corp: 11/132b lim: 40 exec/s: 7 rss: 70Mb L: 5/39 MS: 1 ChangeBit- 00:10:30.777 #15 DONE cov: 10760 ft: 16641 corp: 11/132b lim: 40 exec/s: 7 rss: 70Mb 00:10:30.777 Done 15 runs in 2 second(s) 00:10:31.036 12:04:43 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-1 00:10:31.036 12:04:43 -- ../common.sh@72 -- # (( i++ )) 00:10:31.036 12:04:43 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:10:31.036 12:04:43 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:10:31.036 12:04:43 -- vfio/run.sh@22 -- # local fuzzer_type=2 00:10:31.036 12:04:43 -- vfio/run.sh@23 -- # local timen=1 00:10:31.036 12:04:43 -- vfio/run.sh@24 -- # local core=0x1 00:10:31.036 12:04:43 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:10:31.036 12:04:43 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-2 00:10:31.037 12:04:43 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-2/domain/1 00:10:31.037 12:04:43 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-2/domain/2 00:10:31.037 12:04:43 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-2/fuzz_vfio_json.conf 00:10:31.037 12:04:43 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-2 /tmp/vfio-user-2/domain/1 /tmp/vfio-user-2/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:10:31.037 12:04:43 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-2/domain/1%; 00:10:31.037 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-2/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:10:31.037 12:04:43 -- vfio/run.sh@38 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-2/domain/1 -c /tmp/vfio-user-2/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 -Y /tmp/vfio-user-2/domain/2 -r /tmp/vfio-user-2/spdk2.sock -Z 2 00:10:31.037 [2024-06-11 12:04:44.026401] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:10:31.037 [2024-06-11 12:04:44.026486] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2700759 ] 00:10:31.296 EAL: No free 2048 kB hugepages reported on node 1 00:10:31.296 [2024-06-11 12:04:44.156311] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:31.296 [2024-06-11 12:04:44.209156] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:10:31.296 [2024-06-11 12:04:44.209377] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:31.555 INFO: Running with entropic power schedule (0xFF, 100). 00:10:31.555 INFO: Seed: 348552603 00:10:31.555 INFO: Loaded 1 modules (338807 inline 8-bit counters): 338807 [0x267420c, 0x26c6d83), 00:10:31.555 INFO: Loaded 1 PC tables (338807 PCs): 338807 [0x26c6d88,0x2bf24f8), 00:10:31.555 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:10:31.555 INFO: A corpus is not provided, starting from an empty corpus 00:10:31.555 #2 INITED exec/s: 0 rss: 61Mb 00:10:31.555 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:10:31.555 This may also happen if the target rejected all inputs we tried so far 00:10:31.555 [2024-06-11 12:04:44.543164] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:10:32.382 NEW_FUNC[1/625]: 0x49f060 in fuzz_vfio_user_get_region_info /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:104 00:10:32.382 NEW_FUNC[2/625]: 0x4a3c80 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:10:32.382 #12 NEW cov: 10689 ft: 10670 corp: 2/30b lim: 80 exec/s: 0 rss: 68Mb L: 29/29 MS: 5 CrossOver-InsertByte-CrossOver-CopyPart-InsertRepeatedBytes- 00:10:32.382 [2024-06-11 12:04:45.214522] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:10:32.382 NEW_FUNC[1/2]: 0x10e8860 in _nvmf_subsystem_get_ns /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/./nvmf_internal.h:459 00:10:32.382 NEW_FUNC[2/2]: 0x194cf80 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:10:32.382 #13 NEW cov: 10733 ft: 13297 corp: 3/59b lim: 80 exec/s: 0 rss: 69Mb L: 29/29 MS: 1 ChangeByte- 00:10:32.640 [2024-06-11 12:04:45.465623] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:10:32.640 [2024-06-11 12:04:45.465675] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:10:32.640 NEW_FUNC[1/2]: 0x134fcc0 in endpoint_id /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:638 00:10:32.640 NEW_FUNC[2/2]: 0x134ff50 in vfio_user_log /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:3084 00:10:32.640 #14 NEW cov: 10746 ft: 14276 corp: 4/117b lim: 80 exec/s: 14 rss: 70Mb L: 58/58 MS: 1 InsertRepeatedBytes- 00:10:32.898 [2024-06-11 12:04:45.723886] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:10:32.898 #15 NEW cov: 10746 ft: 14511 corp: 5/146b lim: 80 exec/s: 15 rss: 70Mb L: 29/58 MS: 1 ChangeByte- 00:10:33.157 [2024-06-11 12:04:45.953366] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:10:33.157 #21 NEW cov: 10746 ft: 15317 corp: 6/175b lim: 80 exec/s: 21 rss: 70Mb L: 29/58 MS: 1 CopyPart- 00:10:33.157 [2024-06-11 12:04:46.187539] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:10:33.157 [2024-06-11 12:04:46.187580] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:10:33.416 #22 NEW cov: 10753 ft: 15828 corp: 7/234b lim: 80 exec/s: 22 rss: 70Mb L: 59/59 MS: 1 InsertByte- 00:10:33.416 [2024-06-11 12:04:46.428629] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:10:33.675 #23 NEW cov: 10753 ft: 15989 corp: 8/263b lim: 80 exec/s: 11 rss: 70Mb L: 29/59 MS: 1 ChangeBit- 00:10:33.675 #23 DONE cov: 10753 ft: 15989 corp: 8/263b lim: 80 exec/s: 11 rss: 70Mb 00:10:33.675 Done 23 runs in 2 second(s) 00:10:33.934 12:04:46 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-2 00:10:33.935 12:04:46 -- ../common.sh@72 -- # (( i++ )) 00:10:33.935 12:04:46 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:10:33.935 12:04:46 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:10:33.935 12:04:46 -- vfio/run.sh@22 -- # local fuzzer_type=3 00:10:33.935 12:04:46 -- vfio/run.sh@23 -- # local timen=1 00:10:33.935 12:04:46 -- vfio/run.sh@24 -- # local core=0x1 00:10:33.935 12:04:46 -- vfio/run.sh@25 -- # 
local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:10:33.935 12:04:46 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-3 00:10:33.935 12:04:46 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-3/domain/1 00:10:33.935 12:04:46 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-3/domain/2 00:10:33.935 12:04:46 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-3/fuzz_vfio_json.conf 00:10:33.935 12:04:46 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-3 /tmp/vfio-user-3/domain/1 /tmp/vfio-user-3/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:10:33.935 12:04:46 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-3/domain/1%; 00:10:33.935 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-3/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:10:33.935 12:04:46 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-3/domain/1 -c /tmp/vfio-user-3/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 -Y /tmp/vfio-user-3/domain/2 -r /tmp/vfio-user-3/spdk3.sock -Z 3 00:10:33.935 [2024-06-11 12:04:46.938074] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:10:33.935 [2024-06-11 12:04:46.938161] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2701130 ] 00:10:34.194 EAL: No free 2048 kB hugepages reported on node 1 00:10:34.194 [2024-06-11 12:04:47.066055] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:34.194 [2024-06-11 12:04:47.114300] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:10:34.194 [2024-06-11 12:04:47.114512] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:34.452 INFO: Running with entropic power schedule (0xFF, 100). 00:10:34.452 INFO: Seed: 3245550076 00:10:34.452 INFO: Loaded 1 modules (338807 inline 8-bit counters): 338807 [0x267420c, 0x26c6d83), 00:10:34.452 INFO: Loaded 1 PC tables (338807 PCs): 338807 [0x26c6d88,0x2bf24f8), 00:10:34.452 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:10:34.452 INFO: A corpus is not provided, starting from an empty corpus 00:10:34.452 #2 INITED exec/s: 0 rss: 62Mb 00:10:34.452 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:10:34.452 This may also happen if the target rejected all inputs we tried so far 00:10:35.277 NEW_FUNC[1/622]: 0x49f740 in fuzz_vfio_user_dma_map /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:125 00:10:35.277 NEW_FUNC[2/622]: 0x4a3c80 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:10:35.277 #19 NEW cov: 10689 ft: 10658 corp: 2/77b lim: 320 exec/s: 0 rss: 68Mb L: 76/76 MS: 2 CopyPart-InsertRepeatedBytes- 00:10:35.277 [2024-06-11 12:04:48.113290] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:10:35.277 [2024-06-11 12:04:48.113342] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:10:35.277 [2024-06-11 12:04:48.113366] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:10:35.277 [2024-06-11 12:04:48.113583] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:10:35.277 NEW_FUNC[1/7]: 0x134fcc0 in endpoint_id /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:638 00:10:35.277 NEW_FUNC[2/7]: 0x134ff50 in vfio_user_log /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:3084 00:10:35.277 #22 NEW cov: 10755 ft: 12974 corp: 3/111b lim: 320 exec/s: 0 rss: 69Mb L: 34/76 MS: 3 ShuffleBytes-CopyPart-InsertRepeatedBytes- 00:10:35.536 #23 NEW cov: 10755 ft: 13851 corp: 4/187b lim: 320 exec/s: 23 rss: 70Mb L: 76/76 MS: 1 ChangeBinInt- 00:10:35.795 [2024-06-11 12:04:48.575053] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:10:35.795 [2024-06-11 12:04:48.575088] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:10:35.795 [2024-06-11 12:04:48.575104] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:10:35.795 [2024-06-11 12:04:48.575130] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:10:35.795 #29 NEW cov: 10755 ft: 15412 corp: 5/222b lim: 320 exec/s: 29 rss: 70Mb L: 35/76 MS: 1 InsertByte- 00:10:36.053 #35 NEW cov: 10755 ft: 15981 corp: 6/298b lim: 320 exec/s: 35 rss: 70Mb L: 76/76 MS: 1 ShuffleBytes- 00:10:36.312 #36 NEW cov: 10762 ft: 16033 corp: 7/374b lim: 320 exec/s: 36 rss: 70Mb L: 76/76 MS: 1 ShuffleBytes- 00:10:36.570 #37 NEW cov: 10762 ft: 16454 corp: 8/570b lim: 320 exec/s: 18 rss: 70Mb L: 196/196 MS: 1 InsertRepeatedBytes- 00:10:36.570 #37 DONE cov: 10762 ft: 16454 corp: 8/570b lim: 320 exec/s: 18 rss: 70Mb 00:10:36.570 Done 37 runs in 2 second(s) 00:10:36.829 12:04:49 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-3 00:10:36.829 12:04:49 -- ../common.sh@72 -- # (( i++ )) 00:10:36.829 12:04:49 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:10:36.829 12:04:49 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:10:36.829 12:04:49 -- vfio/run.sh@22 -- # local fuzzer_type=4 00:10:36.829 12:04:49 -- vfio/run.sh@23 -- # local timen=1 00:10:36.829 12:04:49 -- vfio/run.sh@24 -- # local core=0x1 00:10:36.829 12:04:49 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:10:36.829 12:04:49 -- vfio/run.sh@26 -- # local 
fuzzer_dir=/tmp/vfio-user-4 00:10:36.829 12:04:49 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-4/domain/1 00:10:36.829 12:04:49 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-4/domain/2 00:10:36.829 12:04:49 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-4/fuzz_vfio_json.conf 00:10:36.829 12:04:49 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-4 /tmp/vfio-user-4/domain/1 /tmp/vfio-user-4/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:10:36.829 12:04:49 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-4/domain/1%; 00:10:36.829 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-4/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:10:36.829 12:04:49 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-4/domain/1 -c /tmp/vfio-user-4/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 -Y /tmp/vfio-user-4/domain/2 -r /tmp/vfio-user-4/spdk4.sock -Z 4 00:10:36.830 [2024-06-11 12:04:49.738506] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:10:36.830 [2024-06-11 12:04:49.738610] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2701498 ] 00:10:36.830 EAL: No free 2048 kB hugepages reported on node 1 00:10:37.088 [2024-06-11 12:04:49.869992] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:37.088 [2024-06-11 12:04:49.917808] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:10:37.088 [2024-06-11 12:04:49.918013] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:37.088 INFO: Running with entropic power schedule (0xFF, 100). 00:10:37.088 INFO: Seed: 1749577242 00:10:37.347 INFO: Loaded 1 modules (338807 inline 8-bit counters): 338807 [0x267420c, 0x26c6d83), 00:10:37.347 INFO: Loaded 1 PC tables (338807 PCs): 338807 [0x26c6d88,0x2bf24f8), 00:10:37.347 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:10:37.347 INFO: A corpus is not provided, starting from an empty corpus 00:10:37.347 #2 INITED exec/s: 0 rss: 61Mb 00:10:37.347 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:10:37.347 This may also happen if the target rejected all inputs we tried so far 00:10:37.914 NEW_FUNC[1/622]: 0x49ffc0 in fuzz_vfio_user_dma_unmap /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:145 00:10:37.914 NEW_FUNC[2/622]: 0x4a3c80 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:10:37.914 #10 NEW cov: 10696 ft: 10323 corp: 2/131b lim: 320 exec/s: 0 rss: 68Mb L: 130/130 MS: 3 ChangeBit-CMP-InsertRepeatedBytes- DE: "\377\377\377\377\377\377\377\377"- 00:10:37.914 #11 NEW cov: 10710 ft: 13088 corp: 3/229b lim: 320 exec/s: 0 rss: 69Mb L: 98/130 MS: 1 InsertRepeatedBytes- 00:10:38.173 NEW_FUNC[1/1]: 0x194cf80 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:10:38.173 #12 NEW cov: 10727 ft: 14291 corp: 4/367b lim: 320 exec/s: 0 rss: 70Mb L: 138/138 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:10:38.431 #13 NEW cov: 10727 ft: 15737 corp: 5/497b lim: 320 exec/s: 13 rss: 70Mb L: 130/138 MS: 1 ChangeBit- 00:10:38.431 #14 NEW cov: 10727 ft: 16738 corp: 6/605b lim: 320 exec/s: 14 rss: 70Mb L: 108/138 MS: 1 InsertRepeatedBytes- 00:10:38.689 #15 NEW cov: 10727 ft: 17305 corp: 7/733b lim: 320 exec/s: 15 rss: 70Mb L: 128/138 MS: 1 CopyPart- 00:10:38.947 #16 NEW cov: 10727 ft: 17456 corp: 8/797b lim: 320 exec/s: 16 rss: 70Mb L: 64/138 MS: 1 InsertRepeatedBytes- 00:10:38.947 #17 NEW cov: 10727 ft: 17786 corp: 9/861b lim: 320 exec/s: 17 rss: 70Mb L: 64/138 MS: 1 CrossOver- 00:10:39.206 #18 NEW cov: 10734 ft: 17861 corp: 10/967b lim: 320 exec/s: 18 rss: 70Mb L: 106/138 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:10:39.466 #19 NEW cov: 10734 ft: 18205 corp: 11/1039b lim: 320 exec/s: 9 rss: 70Mb L: 72/138 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:10:39.466 #19 DONE cov: 10734 ft: 18205 corp: 11/1039b lim: 320 exec/s: 9 rss: 70Mb 00:10:39.466 ###### Recommended dictionary. ###### 00:10:39.466 "\377\377\377\377\377\377\377\377" # Uses: 3 00:10:39.466 ###### End of recommended dictionary. 
###### 00:10:39.466 Done 19 runs in 2 second(s) 00:10:39.760 12:04:52 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-4 00:10:39.760 12:04:52 -- ../common.sh@72 -- # (( i++ )) 00:10:39.760 12:04:52 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:10:39.760 12:04:52 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:10:39.760 12:04:52 -- vfio/run.sh@22 -- # local fuzzer_type=5 00:10:39.760 12:04:52 -- vfio/run.sh@23 -- # local timen=1 00:10:39.760 12:04:52 -- vfio/run.sh@24 -- # local core=0x1 00:10:39.760 12:04:52 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:10:39.760 12:04:52 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-5 00:10:39.760 12:04:52 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-5/domain/1 00:10:39.760 12:04:52 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-5/domain/2 00:10:39.760 12:04:52 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-5/fuzz_vfio_json.conf 00:10:39.760 12:04:52 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-5 /tmp/vfio-user-5/domain/1 /tmp/vfio-user-5/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:10:39.760 12:04:52 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-5/domain/1%; 00:10:39.761 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-5/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:10:39.761 12:04:52 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-5/domain/1 -c /tmp/vfio-user-5/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 -Y /tmp/vfio-user-5/domain/2 -r /tmp/vfio-user-5/spdk5.sock -Z 5 00:10:39.761 [2024-06-11 12:04:52.607274] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:10:39.761 [2024-06-11 12:04:52.607367] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2701869 ] 00:10:39.761 EAL: No free 2048 kB hugepages reported on node 1 00:10:39.761 [2024-06-11 12:04:52.737251] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:39.761 [2024-06-11 12:04:52.787658] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:10:39.761 [2024-06-11 12:04:52.787865] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:40.077 INFO: Running with entropic power schedule (0xFF, 100). 00:10:40.077 INFO: Seed: 327644438 00:10:40.077 INFO: Loaded 1 modules (338807 inline 8-bit counters): 338807 [0x267420c, 0x26c6d83), 00:10:40.077 INFO: Loaded 1 PC tables (338807 PCs): 338807 [0x26c6d88,0x2bf24f8), 00:10:40.077 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:10:40.077 INFO: A corpus is not provided, starting from an empty corpus 00:10:40.077 #2 INITED exec/s: 0 rss: 61Mb 00:10:40.077 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:10:40.077 This may also happen if the target rejected all inputs we tried so far 00:10:40.077 [2024-06-11 12:04:53.101399] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:10:40.077 [2024-06-11 12:04:53.101461] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:40.902 NEW_FUNC[1/624]: 0x4a09c0 in fuzz_vfio_user_irq_set /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:172 00:10:40.902 NEW_FUNC[2/624]: 0x4a3c80 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:10:40.902 #8 NEW cov: 10590 ft: 10662 corp: 2/68b lim: 120 exec/s: 0 rss: 68Mb L: 67/67 MS: 1 InsertRepeatedBytes- 00:10:40.902 [2024-06-11 12:04:53.774744] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:10:40.902 [2024-06-11 12:04:53.774811] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:40.902 NEW_FUNC[1/5]: 0x16592b0 in nvme_pcie_qpair_submit_tracker /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_pcie_common.c:622 00:10:40.902 NEW_FUNC[2/5]: 0x165c350 in nvme_pcie_copy_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_pcie_common.c:605 00:10:40.902 #9 NEW cov: 10755 ft: 13871 corp: 3/132b lim: 120 exec/s: 0 rss: 69Mb L: 64/67 MS: 1 EraseBytes- 00:10:41.161 [2024-06-11 12:04:54.028912] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:10:41.161 [2024-06-11 12:04:54.028955] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:41.161 #10 NEW cov: 10755 ft: 14321 corp: 4/248b lim: 120 exec/s: 10 rss: 70Mb L: 116/116 MS: 1 InsertRepeatedBytes- 00:10:41.420 [2024-06-11 12:04:54.272375] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:10:41.420 [2024-06-11 12:04:54.272417] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:41.420 #11 NEW cov: 10755 ft: 14600 corp: 5/312b lim: 120 exec/s: 11 rss: 70Mb L: 64/116 MS: 1 CopyPart- 00:10:41.679 [2024-06-11 12:04:54.518820] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:10:41.679 [2024-06-11 12:04:54.518861] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:41.679 #12 NEW cov: 10755 ft: 14722 corp: 6/428b lim: 120 exec/s: 12 rss: 70Mb L: 116/116 MS: 1 ChangeByte- 00:10:41.938 [2024-06-11 12:04:54.769985] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:10:41.938 [2024-06-11 12:04:54.770025] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:41.938 #13 NEW cov: 10762 ft: 15058 corp: 7/494b lim: 120 exec/s: 13 rss: 70Mb L: 66/116 MS: 1 InsertRepeatedBytes- 00:10:42.197 [2024-06-11 12:04:55.026322] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:10:42.197 [2024-06-11 12:04:55.026365] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:42.197 #14 NEW cov: 10762 ft: 15312 corp: 8/558b lim: 120 exec/s: 7 rss: 70Mb L: 64/116 MS: 1 ShuffleBytes- 00:10:42.197 #14 DONE cov: 10762 ft: 15312 corp: 8/558b lim: 120 exec/s: 7 rss: 70Mb 00:10:42.197 Done 14 runs in 2 second(s) 00:10:42.766 12:04:55 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-5 00:10:42.766 12:04:55 -- ../common.sh@72 -- 
# (( i++ )) 00:10:42.766 12:04:55 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:10:42.766 12:04:55 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:10:42.766 12:04:55 -- vfio/run.sh@22 -- # local fuzzer_type=6 00:10:42.766 12:04:55 -- vfio/run.sh@23 -- # local timen=1 00:10:42.766 12:04:55 -- vfio/run.sh@24 -- # local core=0x1 00:10:42.766 12:04:55 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:10:42.766 12:04:55 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-6 00:10:42.766 12:04:55 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-6/domain/1 00:10:42.766 12:04:55 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-6/domain/2 00:10:42.766 12:04:55 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-6/fuzz_vfio_json.conf 00:10:42.766 12:04:55 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-6 /tmp/vfio-user-6/domain/1 /tmp/vfio-user-6/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:10:42.766 12:04:55 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-6/domain/1%; 00:10:42.766 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-6/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:10:42.766 12:04:55 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-6/domain/1 -c /tmp/vfio-user-6/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 -Y /tmp/vfio-user-6/domain/2 -r /tmp/vfio-user-6/spdk6.sock -Z 6 00:10:42.766 [2024-06-11 12:04:55.524946] Starting SPDK v24.01.1-pre git sha1 130b9406a / DPDK 23.11.0 initialization... 00:10:42.766 [2024-06-11 12:04:55.525022] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2702245 ] 00:10:42.766 EAL: No free 2048 kB hugepages reported on node 1 00:10:42.766 [2024-06-11 12:04:55.652311] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:42.766 [2024-06-11 12:04:55.696635] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:10:42.766 [2024-06-11 12:04:55.696839] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:43.026 INFO: Running with entropic power schedule (0xFF, 100). 00:10:43.026 INFO: Seed: 3246608711 00:10:43.026 INFO: Loaded 1 modules (338807 inline 8-bit counters): 338807 [0x267420c, 0x26c6d83), 00:10:43.026 INFO: Loaded 1 PC tables (338807 PCs): 338807 [0x26c6d88,0x2bf24f8), 00:10:43.026 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:10:43.026 INFO: A corpus is not provided, starting from an empty corpus 00:10:43.026 #2 INITED exec/s: 0 rss: 61Mb 00:10:43.026 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:10:43.026 This may also happen if the target rejected all inputs we tried so far 00:10:43.026 [2024-06-11 12:04:56.022402] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:10:43.026 [2024-06-11 12:04:56.022456] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:43.545 NEW_FUNC[1/628]: 0x4a16b0 in fuzz_vfio_user_set_msix /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:190 00:10:43.545 NEW_FUNC[2/628]: 0x4a3c80 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:10:43.545 #6 NEW cov: 10714 ft: 10678 corp: 2/18b lim: 90 exec/s: 0 rss: 68Mb L: 17/17 MS: 4 ChangeByte-ChangeBit-ChangeByte-InsertRepeatedBytes- 00:10:43.545 [2024-06-11 12:04:56.538242] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:10:43.545 [2024-06-11 12:04:56.538301] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:43.804 #8 NEW cov: 10729 ft: 13604 corp: 3/67b lim: 90 exec/s: 0 rss: 69Mb L: 49/49 MS: 2 CopyPart-InsertRepeatedBytes- 00:10:43.804 [2024-06-11 12:04:56.791673] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:10:43.804 [2024-06-11 12:04:56.791716] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:44.063 NEW_FUNC[1/1]: 0x194cf80 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:10:44.063 #9 NEW cov: 10746 ft: 14262 corp: 4/117b lim: 90 exec/s: 9 rss: 70Mb L: 50/50 MS: 1 InsertByte- 00:10:44.063 [2024-06-11 12:04:57.033805] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:10:44.063 [2024-06-11 12:04:57.033847] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:44.322 #10 NEW cov: 10746 ft: 15013 corp: 5/167b lim: 90 exec/s: 10 rss: 70Mb L: 50/50 MS: 1 ChangeBit- 00:10:44.322 [2024-06-11 12:04:57.275581] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:10:44.322 [2024-06-11 12:04:57.275623] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:44.581 #11 NEW cov: 10746 ft: 15117 corp: 6/200b lim: 90 exec/s: 11 rss: 70Mb L: 33/50 MS: 1 EraseBytes- 00:10:44.581 [2024-06-11 12:04:57.517497] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:10:44.581 [2024-06-11 12:04:57.517538] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:44.840 #17 NEW cov: 10746 ft: 15514 corp: 7/257b lim: 90 exec/s: 17 rss: 70Mb L: 57/57 MS: 1 InsertRepeatedBytes- 00:10:44.840 [2024-06-11 12:04:57.760881] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:10:44.840 [2024-06-11 12:04:57.760923] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:45.099 #18 NEW cov: 10753 ft: 15664 corp: 8/346b lim: 90 exec/s: 18 rss: 70Mb L: 89/89 MS: 1 InsertRepeatedBytes- 00:10:45.099 [2024-06-11 12:04:58.004238] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:10:45.099 [2024-06-11 12:04:58.004284] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:45.358 #24 NEW cov: 10753 ft: 15796 corp: 9/371b lim: 90 exec/s: 12 rss: 70Mb L: 25/89 MS: 1 EraseBytes- 00:10:45.358 
#24 DONE cov: 10753 ft: 15796 corp: 9/371b lim: 90 exec/s: 12 rss: 70Mb 00:10:45.358 Done 24 runs in 2 second(s) 00:10:45.618 12:04:58 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-6 00:10:45.618 12:04:58 -- ../common.sh@72 -- # (( i++ )) 00:10:45.618 12:04:58 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:10:45.618 12:04:58 -- vfio/run.sh@75 -- # trap - SIGINT SIGTERM EXIT 00:10:45.618 00:10:45.618 real 0m20.570s 00:10:45.618 user 0m27.985s 00:10:45.618 sys 0m2.336s 00:10:45.618 12:04:58 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:45.618 12:04:58 -- common/autotest_common.sh@10 -- # set +x 00:10:45.618 ************************************ 00:10:45.618 END TEST vfio_fuzz 00:10:45.618 ************************************ 00:10:45.618 12:04:58 -- fuzz/llvm.sh@67 -- # [[ 1 -eq 0 ]] 00:10:45.618 00:10:45.618 real 1m25.825s 00:10:45.618 user 2m5.204s 00:10:45.618 sys 0m11.279s 00:10:45.618 12:04:58 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:45.618 12:04:58 -- common/autotest_common.sh@10 -- # set +x 00:10:45.618 ************************************ 00:10:45.618 END TEST llvm_fuzz 00:10:45.618 ************************************ 00:10:45.618 12:04:58 -- spdk/autotest.sh@378 -- # [[ 0 -eq 1 ]] 00:10:45.618 12:04:58 -- spdk/autotest.sh@383 -- # trap - SIGINT SIGTERM EXIT 00:10:45.618 12:04:58 -- spdk/autotest.sh@385 -- # timing_enter post_cleanup 00:10:45.618 12:04:58 -- common/autotest_common.sh@712 -- # xtrace_disable 00:10:45.618 12:04:58 -- common/autotest_common.sh@10 -- # set +x 00:10:45.618 12:04:58 -- spdk/autotest.sh@386 -- # autotest_cleanup 00:10:45.618 12:04:58 -- common/autotest_common.sh@1371 -- # local autotest_es=0 00:10:45.618 12:04:58 -- common/autotest_common.sh@1372 -- # xtrace_disable 00:10:45.618 12:04:58 -- common/autotest_common.sh@10 -- # set +x 00:10:50.893 INFO: APP EXITING 00:10:50.893 INFO: killing all VMs 00:10:50.893 INFO: killing vhost app 00:10:50.893 WARN: no vhost pid file found 00:10:50.893 INFO: EXIT DONE 00:10:54.181 Waiting for block devices as requested 00:10:54.181 0000:1a:00.0 (8086 0a54): vfio-pci -> nvme 00:10:54.181 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:10:54.181 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:10:54.181 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:10:54.439 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:10:54.439 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:10:54.439 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:10:54.698 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:10:54.698 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:10:54.698 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:10:54.956 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:10:54.957 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:10:54.957 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:10:55.216 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:10:55.216 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:10:55.216 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:10:55.475 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:11:02.044 Cleaning 00:11:02.044 Removing: /dev/shm/spdk_tgt_trace.pid2672166 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2669752 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2670896 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2672166 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2672854 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2673079 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2673322 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2673585 00:11:02.044 Removing: 
/var/run/dpdk/spdk_pid2673959 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2674133 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2674296 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2674532 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2675008 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2677618 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2677960 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2678196 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2678355 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2678917 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2679038 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2679503 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2679680 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2679891 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2680071 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2680233 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2680296 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2680748 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2680944 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2681141 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2681374 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2681595 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2681614 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2681767 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2681966 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2682213 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2682401 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2682597 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2682777 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2682976 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2683156 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2683358 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2683536 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2683731 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2683919 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2684117 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2684295 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2684500 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2684678 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2684879 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2685059 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2685258 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2685442 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2685675 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2685935 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2686170 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2686501 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2686978 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2687191 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2687418 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2687619 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2687841 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2688021 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2688220 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2688403 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2688602 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2688785 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2688984 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2689173 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2689370 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2689559 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2689756 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2689937 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2690141 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2690362 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2690549 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2691083 00:11:02.044 Removing: 
/var/run/dpdk/spdk_pid2691448 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2691734 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2692093 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2692454 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2692823 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2693184 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2693546 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2693909 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2694276 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2694585 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2694899 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2695207 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2695583 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2695943 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2696309 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2696672 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2697036 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2697400 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2697767 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2698134 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2698452 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2698786 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2699092 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2699431 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2699946 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2700388 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2700759 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2701130 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2701498 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2701869 00:11:02.044 Removing: /var/run/dpdk/spdk_pid2702245 00:11:02.044 Clean 00:11:02.044 killing process with pid 2618441 00:11:03.947 killing process with pid 2618438 00:11:03.947 killing process with pid 2618440 00:11:03.947 killing process with pid 2618439 00:11:03.947 12:05:16 -- common/autotest_common.sh@1436 -- # return 0 00:11:03.947 12:05:16 -- spdk/autotest.sh@387 -- # timing_exit post_cleanup 00:11:03.947 12:05:16 -- common/autotest_common.sh@718 -- # xtrace_disable 00:11:03.947 12:05:16 -- common/autotest_common.sh@10 -- # set +x 00:11:03.947 12:05:16 -- spdk/autotest.sh@389 -- # timing_exit autotest 00:11:03.947 12:05:16 -- common/autotest_common.sh@718 -- # xtrace_disable 00:11:03.947 12:05:16 -- common/autotest_common.sh@10 -- # set +x 00:11:04.207 12:05:16 -- spdk/autotest.sh@390 -- # chmod a+r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt 00:11:04.207 12:05:16 -- spdk/autotest.sh@392 -- # [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log ]] 00:11:04.207 12:05:16 -- spdk/autotest.sh@392 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log 00:11:04.207 12:05:16 -- spdk/autotest.sh@394 -- # hash lcov 00:11:04.207 12:05:16 -- spdk/autotest.sh@394 -- # [[ CC_TYPE=clang == *\c\l\a\n\g* ]] 00:11:04.207 12:05:17 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:11:04.207 12:05:17 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]] 00:11:04.207 12:05:17 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:11:04.207 12:05:17 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:11:04.207 12:05:17 -- paths/export.sh@2 -- $ 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:11:04.207 12:05:17 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:11:04.207 12:05:17 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:11:04.207 12:05:17 -- paths/export.sh@5 -- $ export PATH 00:11:04.207 12:05:17 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:11:04.207 12:05:17 -- common/autobuild_common.sh@434 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:11:04.207 12:05:17 -- common/autobuild_common.sh@435 -- $ date +%s 00:11:04.207 12:05:17 -- common/autobuild_common.sh@435 -- $ mktemp -dt spdk_1718100317.XXXXXX 00:11:04.207 12:05:17 -- common/autobuild_common.sh@435 -- $ SPDK_WORKSPACE=/tmp/spdk_1718100317.jMNo3X 00:11:04.207 12:05:17 -- common/autobuild_common.sh@437 -- $ [[ -n '' ]] 00:11:04.207 12:05:17 -- common/autobuild_common.sh@441 -- $ '[' -n v23.11 ']' 00:11:04.207 12:05:17 -- common/autobuild_common.sh@442 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:11:04.207 12:05:17 -- common/autobuild_common.sh@442 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk' 00:11:04.207 12:05:17 -- common/autobuild_common.sh@448 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp' 00:11:04.207 12:05:17 -- common/autobuild_common.sh@450 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:11:04.207 12:05:17 -- common/autobuild_common.sh@451 -- $ get_config_params 00:11:04.207 12:05:17 -- common/autotest_common.sh@387 -- $ xtrace_disable 00:11:04.207 12:05:17 -- common/autotest_common.sh@10 -- $ set +x 00:11:04.207 12:05:17 -- common/autobuild_common.sh@451 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 
--with-vfio-user' 00:11:04.207 12:05:17 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j72 00:11:04.207 12:05:17 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:11:04.207 12:05:17 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]] 00:11:04.207 12:05:17 -- spdk/autopackage.sh@18 -- $ [[ 1 -eq 0 ]] 00:11:04.207 12:05:17 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]] 00:11:04.207 12:05:17 -- spdk/autopackage.sh@19 -- $ timing_finish 00:11:04.207 12:05:17 -- common/autotest_common.sh@724 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl 00:11:04.207 12:05:17 -- common/autotest_common.sh@725 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']' 00:11:04.207 12:05:17 -- common/autotest_common.sh@727 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt 00:11:04.207 12:05:17 -- spdk/autopackage.sh@20 -- $ exit 0 00:11:04.207 + [[ -n 2562275 ]] 00:11:04.207 + sudo kill 2562275 00:11:04.217 [Pipeline] } 00:11:04.235 [Pipeline] // stage 00:11:04.240 [Pipeline] } 00:11:04.261 [Pipeline] // timeout 00:11:04.266 [Pipeline] } 00:11:04.284 [Pipeline] // catchError 00:11:04.291 [Pipeline] } 00:11:04.309 [Pipeline] // wrap 00:11:04.314 [Pipeline] } 00:11:04.331 [Pipeline] // catchError 00:11:04.340 [Pipeline] stage 00:11:04.342 [Pipeline] { (Epilogue) 00:11:04.356 [Pipeline] catchError 00:11:04.358 [Pipeline] { 00:11:04.372 [Pipeline] echo 00:11:04.374 Cleanup processes 00:11:04.379 [Pipeline] sh 00:11:04.663 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:11:04.663 2709591 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:11:04.678 [Pipeline] sh 00:11:04.964 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:11:04.964 ++ grep -v 'sudo pgrep' 00:11:04.964 ++ awk '{print $1}' 00:11:04.964 + sudo kill -9 00:11:04.964 + true 00:11:04.976 [Pipeline] sh 00:11:05.259 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh 00:11:06.649 [Pipeline] sh 00:11:06.932 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh 00:11:06.932 Artifacts sizes are good 00:11:06.949 [Pipeline] archiveArtifacts 00:11:06.957 Archiving artifacts 00:11:07.037 [Pipeline] sh 00:11:07.350 + sudo chown -R sys_sgci /var/jenkins/workspace/short-fuzz-phy-autotest 00:11:07.368 [Pipeline] cleanWs 00:11:07.378 [WS-CLEANUP] Deleting project workspace... 00:11:07.378 [WS-CLEANUP] Deferred wipeout is used... 00:11:07.385 [WS-CLEANUP] done 00:11:07.387 [Pipeline] } 00:11:07.409 [Pipeline] // catchError 00:11:07.423 [Pipeline] sh 00:11:07.705 + logger -p user.info -t JENKINS-CI 00:11:07.713 [Pipeline] } 00:11:07.731 [Pipeline] // stage 00:11:07.736 [Pipeline] } 00:11:07.753 [Pipeline] // node 00:11:07.760 [Pipeline] End of Pipeline 00:11:07.803 Finished: SUCCESS