00:00:00.002 Started by upstream project "autotest-spdk-master-vs-dpdk-main" build number 3400
00:00:00.002 originally caused by:
00:00:00.002 Started by upstream project "nightly-trigger" build number 3011
00:00:00.002 originally caused by:
00:00:00.002 Started by timer
00:00:00.017 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy
00:00:00.032 The recommended git tool is: git
00:00:00.032 using credential 00000000-0000-0000-0000-000000000002
00:00:00.038 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10
00:00:00.048 Fetching changes from the remote Git repository
00:00:00.050 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10
00:00:00.064 Using shallow fetch with depth 1
00:00:00.064 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool
00:00:00.064 > git --version # timeout=10
00:00:00.079 > git --version # 'git version 2.39.2'
00:00:00.079 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:00.083 Setting http proxy: proxy-dmz.intel.com:911
00:00:00.083 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5
00:00:03.879 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10
00:00:03.889 > git rev-parse FETCH_HEAD^{commit} # timeout=10
00:00:03.900 Checking out Revision 6201031def5bfb7f90a861bc162998684798607e (FETCH_HEAD)
00:00:03.900 > git config core.sparsecheckout # timeout=10
00:00:03.909 > git read-tree -mu HEAD # timeout=10
00:00:03.924 > git checkout -f 6201031def5bfb7f90a861bc162998684798607e # timeout=5
00:00:03.941 Commit message: "scripts/kid: Add issue 3354"
00:00:03.941 > git rev-list --no-walk 6201031def5bfb7f90a861bc162998684798607e # timeout=10
00:00:04.041 [Pipeline] Start of Pipeline
00:00:04.052 [Pipeline] library
00:00:04.054 Loading library shm_lib@master
00:00:04.054 Library shm_lib@master is cached. Copying from home.
00:00:04.067 [Pipeline] node
00:00:04.081 Running on WFP20 in /var/jenkins/workspace/short-fuzz-phy-autotest
00:00:04.082 [Pipeline] {
00:00:04.092 [Pipeline] catchError
00:00:04.094 [Pipeline] {
00:00:04.105 [Pipeline] wrap
00:00:04.113 [Pipeline] {
00:00:04.119 [Pipeline] stage
00:00:04.120 [Pipeline] { (Prologue)
00:00:04.321 [Pipeline] sh
00:00:04.601 + logger -p user.info -t JENKINS-CI
00:00:04.617 [Pipeline] echo
00:00:04.619 Node: WFP20
00:00:04.625 [Pipeline] sh
00:00:04.921 [Pipeline] setCustomBuildProperty
00:00:04.933 [Pipeline] echo
00:00:04.934 Cleanup processes
00:00:04.939 [Pipeline] sh
00:00:05.220 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:05.220 47086 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:05.231 [Pipeline] sh
00:00:05.516 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:05.516 ++ grep -v 'sudo pgrep'
00:00:05.516 ++ awk '{print $1}'
00:00:05.516 + sudo kill -9
00:00:05.516 + true
00:00:05.530 [Pipeline] cleanWs
00:00:05.540 [WS-CLEANUP] Deleting project workspace...
00:00:05.540 [WS-CLEANUP] Deferred wipeout is used...
00:00:05.546 [WS-CLEANUP] done
00:00:05.550 [Pipeline] setCustomBuildProperty
00:00:05.565 [Pipeline] sh
00:00:05.848 + sudo git config --global --replace-all safe.directory '*'
00:00:05.924 [Pipeline] nodesByLabel
00:00:05.925 Found a total of 1 nodes with the 'sorcerer' label
00:00:05.935 [Pipeline] httpRequest
00:00:05.939 HttpMethod: GET
00:00:05.940 URL: http://10.211.164.96/packages/jbp_6201031def5bfb7f90a861bc162998684798607e.tar.gz
00:00:05.943 Sending request to url: http://10.211.164.96/packages/jbp_6201031def5bfb7f90a861bc162998684798607e.tar.gz
00:00:05.964 Response Code: HTTP/1.1 200 OK
00:00:05.964 Success: Status code 200 is in the accepted range: 200,404
00:00:05.965 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/jbp_6201031def5bfb7f90a861bc162998684798607e.tar.gz
00:00:29.198 [Pipeline] sh
00:00:29.474 + tar --no-same-owner -xf jbp_6201031def5bfb7f90a861bc162998684798607e.tar.gz
00:00:29.491 [Pipeline] httpRequest
00:00:29.495 HttpMethod: GET
00:00:29.496 URL: http://10.211.164.96/packages/spdk_06472fb6d0c234046253a9989fef790e0cbb219e.tar.gz
00:00:29.496 Sending request to url: http://10.211.164.96/packages/spdk_06472fb6d0c234046253a9989fef790e0cbb219e.tar.gz
00:00:29.514 Response Code: HTTP/1.1 200 OK
00:00:29.514 Success: Status code 200 is in the accepted range: 200,404
00:00:29.515 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk_06472fb6d0c234046253a9989fef790e0cbb219e.tar.gz
00:01:34.502 [Pipeline] sh
00:01:34.788 + tar --no-same-owner -xf spdk_06472fb6d0c234046253a9989fef790e0cbb219e.tar.gz
00:01:37.333 [Pipeline] sh
00:01:37.612 + git -C spdk log --oneline -n5
00:01:37.612 06472fb6d lib/idxd: fix batch size in kernel IDXD
00:01:37.612 44dcf4fb9 pkgdep/idxd: Add dependency for accel-config used in kernel IDXD
00:01:37.612 3dbaa93c1 nvmf: pass command dword 12 and 13 for write
00:01:37.612 19327fc3a bdev/nvme: use dtype/dspec for write commands
00:01:37.612 c11e5c113 bdev: introduce bdev_nvme_cdw12 and cdw13, and add them to ext_opts
00:01:37.626 [Pipeline] withCredentials
00:01:37.634 > git --version # timeout=10
00:01:37.643 > git --version # 'git version 2.39.2'
00:01:37.658 Masking supported pattern matches of $GIT_PASSWORD or $GIT_ASKPASS
00:01:37.660 [Pipeline] {
00:01:37.667 [Pipeline] retry
00:01:37.668 [Pipeline] {
00:01:37.679 [Pipeline] sh
00:01:37.952 + git ls-remote http://dpdk.org/git/dpdk main
00:01:37.964 [Pipeline] }
00:01:37.983 [Pipeline] // retry
00:01:37.988 [Pipeline] }
00:01:38.005 [Pipeline] // withCredentials
00:01:38.016 [Pipeline] httpRequest
00:01:38.020 HttpMethod: GET
00:01:38.020 URL: http://10.211.164.96/packages/dpdk_7e06c0de1952d3109a5b0c4779d7e7d8059c9d78.tar.gz
00:01:38.021 Sending request to url: http://10.211.164.96/packages/dpdk_7e06c0de1952d3109a5b0c4779d7e7d8059c9d78.tar.gz
00:01:38.023 Response Code: HTTP/1.1 200 OK
00:01:38.023 Success: Status code 200 is in the accepted range: 200,404
00:01:38.023 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk_7e06c0de1952d3109a5b0c4779d7e7d8059c9d78.tar.gz
00:01:41.827 [Pipeline] sh
00:01:42.107 + tar --no-same-owner -xf dpdk_7e06c0de1952d3109a5b0c4779d7e7d8059c9d78.tar.gz
00:01:43.058 [Pipeline] sh
00:01:43.342 + git -C dpdk log --oneline -n5
00:01:43.342 7e06c0de19 examples: move alignment attribute on types for MSVC
00:01:43.342 27595cd830 drivers: move alignment attribute on types for MSVC
00:01:43.342 0efea35a2b app: move alignment attribute on types for MSVC
00:01:43.342 e2e546ab5b version: 24.07-rc0
00:01:43.342 a9778aad62 version: 24.03.0
00:01:43.353 [Pipeline] }
00:01:43.370 [Pipeline] // stage
00:01:43.379 [Pipeline] stage
00:01:43.381 [Pipeline] { (Prepare)
00:01:43.404 [Pipeline] writeFile
00:01:43.423 [Pipeline] sh
00:01:43.710 + logger -p user.info -t JENKINS-CI
00:01:43.723 [Pipeline] sh
00:01:44.009 + logger -p user.info -t JENKINS-CI
00:01:44.022 [Pipeline] sh
00:01:44.306 + cat autorun-spdk.conf
00:01:44.306 SPDK_RUN_FUNCTIONAL_TEST=1
00:01:44.306 SPDK_RUN_UBSAN=1
00:01:44.306 SPDK_TEST_FUZZER=1
00:01:44.306 SPDK_TEST_FUZZER_SHORT=1
00:01:44.306 SPDK_TEST_NATIVE_DPDK=main
00:01:44.306 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:44.306 RUN_NIGHTLY=1
00:01:44.317 [Pipeline] readFile
00:01:44.339 [Pipeline] withEnv
00:01:44.341 [Pipeline] {
00:01:44.354 [Pipeline] sh
00:01:44.639 + set -ex
00:01:44.639 + [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf ]]
00:01:44.639 + source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
00:01:44.639 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:01:44.639 ++ SPDK_RUN_UBSAN=1
00:01:44.639 ++ SPDK_TEST_FUZZER=1
00:01:44.639 ++ SPDK_TEST_FUZZER_SHORT=1
00:01:44.639 ++ SPDK_TEST_NATIVE_DPDK=main
00:01:44.639 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:44.639 ++ RUN_NIGHTLY=1
00:01:44.639 + case $SPDK_TEST_NVMF_NICS in
00:01:44.639 + DRIVERS=
00:01:44.639 + [[ -n '' ]]
00:01:44.639 + exit 0
00:01:44.649 [Pipeline] }
00:01:44.666 [Pipeline] // withEnv
00:01:44.671 [Pipeline] }
00:01:44.688 [Pipeline] // stage
00:01:44.697 [Pipeline] catchError
00:01:44.699 [Pipeline] {
00:01:44.715 [Pipeline] timeout
00:01:44.715 Timeout set to expire in 30 min
00:01:44.717 [Pipeline] {
00:01:44.732 [Pipeline] stage
00:01:44.734 [Pipeline] { (Tests)
00:01:44.750 [Pipeline] sh
00:01:45.033 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/short-fuzz-phy-autotest
00:01:45.034 ++ readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest
00:01:45.034 + DIR_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest
00:01:45.034 + [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest ]]
00:01:45.034 + DIR_SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:01:45.034 + DIR_OUTPUT=/var/jenkins/workspace/short-fuzz-phy-autotest/output
00:01:45.034 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk ]]
00:01:45.034 + [[ ! -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]]
00:01:45.034 + mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/output
00:01:45.034 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]]
00:01:45.034 + cd /var/jenkins/workspace/short-fuzz-phy-autotest
00:01:45.034 + source /etc/os-release
00:01:45.034 ++ NAME='Fedora Linux'
00:01:45.034 ++ VERSION='38 (Cloud Edition)'
00:01:45.034 ++ ID=fedora
00:01:45.034 ++ VERSION_ID=38
00:01:45.034 ++ VERSION_CODENAME=
00:01:45.034 ++ PLATFORM_ID=platform:f38
00:01:45.034 ++ PRETTY_NAME='Fedora Linux 38 (Cloud Edition)'
00:01:45.034 ++ ANSI_COLOR='0;38;2;60;110;180'
00:01:45.034 ++ LOGO=fedora-logo-icon
00:01:45.034 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:38
00:01:45.034 ++ HOME_URL=https://fedoraproject.org/
00:01:45.034 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f38/system-administrators-guide/
00:01:45.034 ++ SUPPORT_URL=https://ask.fedoraproject.org/
00:01:45.034 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/
00:01:45.034 ++ REDHAT_BUGZILLA_PRODUCT=Fedora
00:01:45.034 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=38
00:01:45.034 ++ REDHAT_SUPPORT_PRODUCT=Fedora
00:01:45.034 ++ REDHAT_SUPPORT_PRODUCT_VERSION=38
00:01:45.034 ++ SUPPORT_END=2024-05-14
00:01:45.034 ++ VARIANT='Cloud Edition'
00:01:45.034 ++ VARIANT_ID=cloud
00:01:45.034 + uname -a
00:01:45.034 Linux spdk-wfp-20 6.7.0-68.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Jan 15 00:59:40 UTC 2024 x86_64 GNU/Linux
00:01:45.034 + sudo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status
00:01:48.325 Hugepages
00:01:48.325 node hugesize free / total
00:01:48.325 node0 1048576kB 0 / 0
00:01:48.325 node0 2048kB 0 / 0
00:01:48.325 node1 1048576kB 0 / 0
00:01:48.325 node1 2048kB 0 / 0
00:01:48.325
00:01:48.325 Type BDF Vendor Device NUMA Driver Device Block devices
00:01:48.325 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - -
00:01:48.325 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - -
00:01:48.325 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - -
00:01:48.325 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - -
00:01:48.325 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - -
00:01:48.325 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - -
00:01:48.325 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - -
00:01:48.325 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - -
00:01:48.325 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - -
00:01:48.325 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - -
00:01:48.325 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - -
00:01:48.325 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - -
00:01:48.325 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - -
00:01:48.325 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - -
00:01:48.325 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - -
00:01:48.325 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - -
00:01:48.325 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1
00:01:48.325 + rm -f /tmp/spdk-ld-path
00:01:48.325 + source autorun-spdk.conf
00:01:48.325 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:01:48.325 ++ SPDK_RUN_UBSAN=1
00:01:48.325 ++ SPDK_TEST_FUZZER=1
00:01:48.325 ++ SPDK_TEST_FUZZER_SHORT=1
00:01:48.325 ++ SPDK_TEST_NATIVE_DPDK=main
00:01:48.325 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:48.325 ++ RUN_NIGHTLY=1
00:01:48.325 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 ))
00:01:48.326 + [[ -n '' ]]
00:01:48.326 + sudo git config --global --add safe.directory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:01:48.326 + for M in /var/spdk/build-*-manifest.txt
00:01:48.326 + [[ -f /var/spdk/build-pkg-manifest.txt ]]
00:01:48.326 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/
00:01:48.326 + for M in /var/spdk/build-*-manifest.txt
00:01:48.326 + [[ -f /var/spdk/build-repo-manifest.txt ]]
00:01:48.326 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/
00:01:48.326 ++ uname
00:01:48.326 + [[ Linux == \L\i\n\u\x ]]
00:01:48.326 + sudo dmesg -T
00:01:48.326 + sudo dmesg --clear
00:01:48.326 + dmesg_pid=48560
00:01:48.326 + [[ Fedora Linux == FreeBSD ]]
00:01:48.326 + export UNBIND_ENTIRE_IOMMU_GROUP=yes
00:01:48.326 + UNBIND_ENTIRE_IOMMU_GROUP=yes
00:01:48.326 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]]
00:01:48.326 + [[ -x /usr/src/fio-static/fio ]]
00:01:48.326 + export FIO_BIN=/usr/src/fio-static/fio
00:01:48.326 + sudo dmesg -Tw
00:01:48.326 + FIO_BIN=/usr/src/fio-static/fio
00:01:48.326 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\s\h\o\r\t\-\f\u\z\z\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]]
00:01:48.326 + [[ ! -v VFIO_QEMU_BIN ]]
00:01:48.326 + [[ -e /usr/local/qemu/vfio-user-latest ]]
00:01:48.326 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:01:48.326 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:01:48.326 + [[ -e /usr/local/qemu/vanilla-latest ]]
00:01:48.326 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:01:48.326 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:01:48.326 + spdk/autorun.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
00:01:48.326 Test configuration:
00:01:48.326 SPDK_RUN_FUNCTIONAL_TEST=1
00:01:48.326 SPDK_RUN_UBSAN=1
00:01:48.326 SPDK_TEST_FUZZER=1
00:01:48.326 SPDK_TEST_FUZZER_SHORT=1
00:01:48.326 SPDK_TEST_NATIVE_DPDK=main
00:01:48.326 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:48.326 RUN_NIGHTLY=1
20:49:03 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh
20:49:03 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]]
20:49:03 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
20:49:03 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
20:49:03 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
20:49:03 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
20:49:03 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
20:49:03 -- paths/export.sh@5 -- $ export PATH
20:49:03 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
20:49:03 -- common/autobuild_common.sh@434 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output
20:49:03 -- common/autobuild_common.sh@435 -- $ date +%s
20:49:03 -- common/autobuild_common.sh@435 -- $ mktemp -dt spdk_1714070943.XXXXXX
20:49:03 -- common/autobuild_common.sh@435 -- $ SPDK_WORKSPACE=/tmp/spdk_1714070943.RwvhfX
20:49:03 -- common/autobuild_common.sh@437 -- $ [[ -n '' ]]
20:49:03 -- common/autobuild_common.sh@441 -- $ '[' -n main ']'
20:49:03 -- common/autobuild_common.sh@442 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
20:49:03 -- common/autobuild_common.sh@442 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk'
20:49:03 -- common/autobuild_common.sh@448 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp'
20:49:03 -- common/autobuild_common.sh@450 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
20:49:03 -- common/autobuild_common.sh@451 -- $ get_config_params
20:49:03 -- common/autotest_common.sh@385 -- $ xtrace_disable
20:49:03 -- common/autotest_common.sh@10 -- $ set +x
20:49:03 -- common/autobuild_common.sh@451 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user'
20:49:03 -- common/autobuild_common.sh@453 -- $ start_monitor_resources
20:49:03 -- pm/common@17 -- $ local monitor
20:49:03 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
20:49:03 -- pm/common@23 -- $ MONITOR_RESOURCES_PIDS["$monitor"]=48598
20:49:03 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
20:49:03 -- pm/common@23 -- $ MONITOR_RESOURCES_PIDS["$monitor"]=48600
20:49:03 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
20:49:03 -- pm/common@21 -- $ date +%s
20:49:03 -- pm/common@23 -- $ MONITOR_RESOURCES_PIDS["$monitor"]=48602
20:49:03 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
20:49:03 -- pm/common@21 -- $ date +%s
20:49:03 -- pm/common@21 -- $ date +%s
20:49:03 -- pm/common@23 -- $ MONITOR_RESOURCES_PIDS["$monitor"]=48605
20:49:03 -- pm/common@26 -- $ sleep 1
20:49:03 -- pm/common@21 -- $ date +%s
20:49:03 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1714070943
20:49:03 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1714070943
20:49:03 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1714070943
20:49:03 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1714070943
Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1714070943_collect-cpu-temp.pm.log
Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1714070943_collect-vmstat.pm.log
Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1714070943_collect-bmc-pm.bmc.pm.log
Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1714070943_collect-cpu-load.pm.log
20:49:04 -- common/autobuild_common.sh@454 -- $ trap stop_monitor_resources EXIT
20:49:04 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD=
20:49:04 -- spdk/autobuild.sh@12 -- $ umask 022
20:49:04 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
20:49:04 -- spdk/autobuild.sh@16 -- $ date -u
00:01:49.338 Thu Apr 25 06:49:04 PM UTC 2024
20:49:04 -- spdk/autobuild.sh@17 -- $ git describe --tags
00:01:49.338 v24.05-pre-448-g06472fb6d
20:49:04 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']'
20:49:04 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']'
20:49:04 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan'
20:49:04 -- common/autotest_common.sh@1087 -- $ '[' 3 -le 1 ']'
20:49:04 -- common/autotest_common.sh@1093 -- $ xtrace_disable
20:49:04 -- common/autotest_common.sh@10 -- $ set +x
00:01:49.599 ************************************
00:01:49.599 START TEST ubsan
00:01:49.599 ************************************
20:49:05 -- common/autotest_common.sh@1111 -- $ echo 'using ubsan'
00:01:49.599 using ubsan
00:01:49.599
00:01:49.599 real 0m0.000s
00:01:49.599 user 0m0.000s
00:01:49.599 sys 0m0.000s
20:49:05 -- common/autotest_common.sh@1112 -- $ xtrace_disable
20:49:05 -- common/autotest_common.sh@10 -- $ set +x
00:01:49.599 ************************************
00:01:49.599 END TEST ubsan
00:01:49.599 ************************************
20:49:05 -- spdk/autobuild.sh@27 -- $ '[' -n main ']'
20:49:05 -- spdk/autobuild.sh@28 -- $ build_native_dpdk
20:49:05 -- common/autobuild_common.sh@427 -- $ run_test build_native_dpdk _build_native_dpdk
20:49:05 -- common/autotest_common.sh@1087 -- $ '[' 2 -le 1 ']'
20:49:05 -- common/autotest_common.sh@1093 -- $ xtrace_disable
20:49:05 -- common/autotest_common.sh@10 -- $ set +x
00:01:49.599 ************************************
00:01:49.599 START TEST build_native_dpdk
00:01:49.599 ************************************
20:49:05 -- common/autotest_common.sh@1111 -- $ _build_native_dpdk
20:49:05 -- common/autobuild_common.sh@48 -- $ local external_dpdk_dir
20:49:05 -- common/autobuild_common.sh@49 -- $ local external_dpdk_base_dir
20:49:05 -- common/autobuild_common.sh@50 -- $ local compiler_version
20:49:05 -- common/autobuild_common.sh@51 -- $ local compiler
20:49:05 -- common/autobuild_common.sh@52 -- $ local dpdk_kmods
20:49:05 -- common/autobuild_common.sh@53 -- $ local repo=dpdk
20:49:05 -- common/autobuild_common.sh@55 -- $ compiler=gcc
20:49:05 -- common/autobuild_common.sh@61 -- $ export CC=gcc
20:49:05 -- common/autobuild_common.sh@61 -- $ CC=gcc
20:49:05 -- common/autobuild_common.sh@63 -- $ [[ gcc != *clang* ]]
20:49:05 -- common/autobuild_common.sh@63 -- $ [[ gcc != *gcc* ]]
20:49:05 -- common/autobuild_common.sh@68 -- $ gcc -dumpversion
20:49:05 -- common/autobuild_common.sh@68 -- $ compiler_version=13
20:49:05 -- common/autobuild_common.sh@69 -- $ compiler_version=13
20:49:05 -- common/autobuild_common.sh@70 -- $ external_dpdk_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
20:49:05 -- common/autobuild_common.sh@71 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
20:49:05 -- common/autobuild_common.sh@71 -- $ external_dpdk_base_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk
20:49:05 -- common/autobuild_common.sh@73 -- $ [[ ! -d /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk ]]
20:49:05 -- common/autobuild_common.sh@82 -- $ orgdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
20:49:05 -- common/autobuild_common.sh@83 -- $ git -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk log --oneline -n 5
00:01:49.599 7e06c0de19 examples: move alignment attribute on types for MSVC
00:01:49.599 27595cd830 drivers: move alignment attribute on types for MSVC
00:01:49.599 0efea35a2b app: move alignment attribute on types for MSVC
00:01:49.599 e2e546ab5b version: 24.07-rc0
00:01:49.599 a9778aad62 version: 24.03.0
20:49:05 -- common/autobuild_common.sh@85 -- $ dpdk_cflags='-fPIC -g -fcommon'
20:49:05 -- common/autobuild_common.sh@86 -- $ dpdk_ldflags=
20:49:05 -- common/autobuild_common.sh@87 -- $ dpdk_ver=24.07.0-rc0
20:49:05 -- common/autobuild_common.sh@89 -- $ [[ gcc == *gcc* ]]
20:49:05 -- common/autobuild_common.sh@89 -- $ [[ 13 -ge 5 ]]
20:49:05 -- common/autobuild_common.sh@90 -- $ dpdk_cflags+=' -Werror'
20:49:05 -- common/autobuild_common.sh@93 -- $ [[ gcc == *gcc* ]]
20:49:05 -- common/autobuild_common.sh@93 -- $ [[ 13 -ge 10 ]]
20:49:05 -- common/autobuild_common.sh@94 -- $ dpdk_cflags+=' -Wno-stringop-overflow'
20:49:05 -- common/autobuild_common.sh@100 -- $ DPDK_DRIVERS=("bus" "bus/pci" "bus/vdev" "mempool/ring" "net/i40e" "net/i40e/base")
20:49:05 -- common/autobuild_common.sh@102 -- $ local mlx5_libs_added=n
20:49:05 -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]]
20:49:05 -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]]
20:49:05 -- common/autobuild_common.sh@139 -- $ [[ 0 -eq 1 ]]
20:49:05 -- common/autobuild_common.sh@167 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk
20:49:05 -- common/autobuild_common.sh@168 -- $ uname -s
20:49:05 -- common/autobuild_common.sh@168 -- $ '[' Linux = Linux ']'
20:49:05 -- common/autobuild_common.sh@169 -- $ lt 24.07.0-rc0 21.11.0
20:49:05 -- scripts/common.sh@370 -- $ cmp_versions 24.07.0-rc0 '<' 21.11.0
20:49:05 -- scripts/common.sh@330 -- $ local ver1 ver1_l
20:49:05 -- scripts/common.sh@331 -- $ local ver2 ver2_l
20:49:05 -- scripts/common.sh@333 -- $ IFS=.-:
20:49:05 -- scripts/common.sh@333 -- $ read -ra ver1
20:49:05 -- scripts/common.sh@334 -- $ IFS=.-:
20:49:05 -- scripts/common.sh@334 -- $ read -ra ver2
20:49:05 -- scripts/common.sh@335 -- $ local 'op=<'
20:49:05 -- scripts/common.sh@337 -- $ ver1_l=4
20:49:05 -- scripts/common.sh@338 -- $ ver2_l=3
20:49:05 -- scripts/common.sh@340 -- $ local lt=0 gt=0 eq=0 v
20:49:05 -- scripts/common.sh@341 -- $ case "$op" in
20:49:05 -- scripts/common.sh@342 -- $ : 1
20:49:05 -- scripts/common.sh@361 -- $ (( v = 0 ))
20:49:05 -- scripts/common.sh@361 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
20:49:05 -- scripts/common.sh@362 -- $ decimal 24
20:49:05 -- scripts/common.sh@350 -- $ local d=24
20:49:05 -- scripts/common.sh@351 -- $ [[ 24 =~ ^[0-9]+$ ]]
20:49:05 -- scripts/common.sh@352 -- $ echo 24
20:49:05 -- scripts/common.sh@362 -- $ ver1[v]=24
20:49:05 -- scripts/common.sh@363 -- $ decimal 21
20:49:05 -- scripts/common.sh@350 -- $ local d=21
20:49:05 -- scripts/common.sh@351 -- $ [[ 21 =~ ^[0-9]+$ ]]
20:49:05 -- scripts/common.sh@352 -- $ echo 21
20:49:05 -- scripts/common.sh@363 -- $ ver2[v]=21
20:49:05 -- scripts/common.sh@364 -- $ (( ver1[v] > ver2[v] ))
20:49:05 -- scripts/common.sh@364 -- $ return 1
20:49:05 -- common/autobuild_common.sh@173 -- $ patch -p1
00:01:49.600 patching file config/rte_config.h
00:01:49.600 Hunk #1 succeeded at 70 (offset 11 lines).
20:49:05 -- common/autobuild_common.sh@177 -- $ dpdk_kmods=false
20:49:05 -- common/autobuild_common.sh@178 -- $ uname -s
20:49:05 -- common/autobuild_common.sh@178 -- $ '[' Linux = FreeBSD ']'
20:49:05 -- common/autobuild_common.sh@182 -- $ printf %s, bus bus/pci bus/vdev mempool/ring net/i40e net/i40e/base
20:49:05 -- common/autobuild_common.sh@182 -- $ meson build-tmp --prefix=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --libdir lib -Denable_docs=false -Denable_kmods=false -Dtests=false -Dc_link_args= '-Dc_args=-fPIC -g -fcommon -Werror -Wno-stringop-overflow' -Dmachine=native -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,
00:01:54.931 The Meson build system
00:01:54.931 Version: 1.3.1
00:01:54.931 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk
00:01:54.931 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp
00:01:54.931 Build type: native build
00:01:54.931 Program cat found: YES (/usr/bin/cat)
00:01:54.931 Project name: DPDK
00:01:54.931 Project version: 24.07.0-rc0
00:01:54.931 C compiler for the host machine: gcc (gcc 13.2.1 "gcc (GCC) 13.2.1 20231011 (Red Hat 13.2.1-4)")
00:01:54.931 C linker for the host machine: gcc ld.bfd 2.39-16
00:01:54.931 Host machine cpu family: x86_64
00:01:54.931 Host machine cpu: x86_64
00:01:54.931 Message: ## Building in Developer Mode ##
00:01:54.931 Program pkg-config found: YES (/usr/bin/pkg-config)
00:01:54.931 Program check-symbols.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/check-symbols.sh)
00:01:54.931 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/options-ibverbs-static.sh)
00:01:54.931 Program python3 found: YES (/usr/bin/python3)
00:01:54.931 Program cat found: YES (/usr/bin/cat)
00:01:54.931 config/meson.build:120: WARNING: The "machine" option is deprecated. Please use "cpu_instruction_set" instead.
00:01:54.931 Compiler for C supports arguments -march=native: YES
00:01:54.931 Checking for size of "void *" : 8
00:01:54.931 Checking for size of "void *" : 8 (cached)
00:01:54.931 Compiler for C supports link arguments -Wl,--undefined-version: NO
00:01:54.931 Library m found: YES
00:01:54.931 Library numa found: YES
00:01:54.931 Has header "numaif.h" : YES
00:01:54.931 Library fdt found: NO
00:01:54.931 Library execinfo found: NO
00:01:54.931 Has header "execinfo.h" : YES
00:01:54.931 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0
00:01:54.931 Run-time dependency libarchive found: NO (tried pkgconfig)
00:01:54.931 Run-time dependency libbsd found: NO (tried pkgconfig)
00:01:54.931 Run-time dependency jansson found: NO (tried pkgconfig)
00:01:54.931 Run-time dependency openssl found: YES 3.0.9
00:01:54.931 Run-time dependency libpcap found: YES 1.10.4
00:01:54.931 Has header "pcap.h" with dependency libpcap: YES
00:01:54.931 Compiler for C supports arguments -Wcast-qual: YES
00:01:54.931 Compiler for C supports arguments -Wdeprecated: YES
00:01:54.931 Compiler for C supports arguments -Wformat: YES
00:01:54.931 Compiler for C supports arguments -Wformat-nonliteral: NO
00:01:54.931 Compiler for C supports arguments -Wformat-security: NO
00:01:54.931 Compiler for C supports arguments -Wmissing-declarations: YES
00:01:54.931 Compiler for C supports arguments -Wmissing-prototypes: YES
00:01:54.931 Compiler for C supports arguments -Wnested-externs: YES
00:01:54.931 Compiler for C supports arguments -Wold-style-definition: YES
00:01:54.931 Compiler for C supports arguments -Wpointer-arith: YES
00:01:54.931 Compiler for C supports arguments -Wsign-compare: YES
00:01:54.931 Compiler for C supports arguments -Wstrict-prototypes: YES
00:01:54.931 Compiler for C supports arguments -Wundef: YES
00:01:54.931 Compiler for C supports arguments -Wwrite-strings: YES
00:01:54.931 Compiler for C supports arguments -Wno-address-of-packed-member: YES
00:01:54.931 Compiler for C supports arguments -Wno-packed-not-aligned: YES
00:01:54.931 Compiler for C supports arguments -Wno-missing-field-initializers: YES
00:01:54.931 Compiler for C supports arguments -Wno-zero-length-bounds: YES
00:01:54.931 Program objdump found: YES (/usr/bin/objdump)
00:01:54.931 Compiler for C supports arguments -mavx512f: YES
00:01:54.931 Checking if "AVX512 checking" compiles: YES
00:01:54.931 Fetching value of define "__SSE4_2__" : 1
00:01:54.931 Fetching value of define "__AES__" : 1
00:01:54.931 Fetching value of define "__AVX__" : 1
00:01:54.931 Fetching value of define "__AVX2__" : 1
00:01:54.931 Fetching value of define "__AVX512BW__" : 1
00:01:54.931 Fetching value of define "__AVX512CD__" : 1
00:01:54.931 Fetching value of define "__AVX512DQ__" : 1
00:01:54.931 Fetching value of define "__AVX512F__" : 1
00:01:54.931 Fetching value of define "__AVX512VL__" : 1
00:01:54.931 Fetching value of define "__PCLMUL__" : 1
00:01:54.931 Fetching value of define "__RDRND__" : 1
00:01:54.931 Fetching value of define "__RDSEED__" : 1
00:01:54.931 Fetching value of define "__VPCLMULQDQ__" : (undefined)
00:01:54.931 Compiler for C supports arguments -Wno-format-truncation: YES
00:01:54.931 Message: lib/log: Defining dependency "log"
00:01:54.931 Message: lib/kvargs: Defining dependency "kvargs"
00:01:54.931 Message: lib/argparse: Defining dependency "argparse"
00:01:54.931 Message: lib/telemetry: Defining dependency "telemetry"
00:01:54.931 Checking for function "getentropy" : NO
00:01:54.931 Message: lib/eal: Defining dependency "eal"
00:01:54.931 Message: lib/ring: Defining dependency "ring"
00:01:54.931 Message: lib/rcu: Defining dependency "rcu"
00:01:54.931 Message: lib/mempool: Defining dependency "mempool"
00:01:54.931 Message: lib/mbuf: Defining dependency "mbuf"
00:01:54.931 Fetching value of define "__PCLMUL__" : 1 (cached)
00:01:54.931 Fetching value of define "__AVX512F__" : 1 (cached)
00:01:54.931 Fetching value of define "__AVX512BW__" : 1 (cached)
00:01:54.931 Fetching value of define "__AVX512DQ__" : 1 (cached)
00:01:54.931 Fetching value of define "__AVX512VL__" : 1 (cached)
00:01:54.931 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached)
00:01:54.931 Compiler for C supports arguments -mpclmul: YES
00:01:54.931 Compiler for C supports arguments -maes: YES
00:01:54.931 Compiler for C supports arguments -mavx512f: YES (cached)
00:01:54.931 Compiler for C supports arguments -mavx512bw: YES
00:01:54.931 Compiler for C supports arguments -mavx512dq: YES
00:01:54.931 Compiler for C supports arguments -mavx512vl: YES
00:01:54.931 Compiler for C supports arguments -mvpclmulqdq: YES
00:01:54.931 Compiler for C supports arguments -mavx2: YES
00:01:54.931 Compiler for C supports arguments -mavx: YES
00:01:54.931 Message: lib/net: Defining dependency "net"
00:01:54.931 Message: lib/meter: Defining dependency "meter"
00:01:54.931 Message: lib/ethdev: Defining dependency "ethdev"
00:01:54.931 Message: lib/pci: Defining dependency "pci"
00:01:54.931 Message: lib/cmdline: Defining dependency "cmdline"
00:01:54.931 Message: lib/metrics: Defining dependency "metrics"
00:01:54.931 Message: lib/hash: Defining dependency "hash"
00:01:54.931 Message: lib/timer: Defining dependency "timer"
00:01:54.931 Fetching value of define "__AVX512F__" : 1 (cached)
00:01:54.931 Fetching value of define "__AVX512VL__" : 1 (cached)
00:01:54.931 Fetching value of define "__AVX512CD__" : 1 (cached)
00:01:54.931 Fetching value of define "__AVX512BW__" : 1 (cached)
00:01:54.931 Message: lib/acl: Defining dependency "acl"
00:01:54.931 Message: lib/bbdev: Defining dependency "bbdev"
00:01:54.931 Message: lib/bitratestats: Defining dependency "bitratestats"
00:01:54.931 Run-time dependency libelf found: YES 0.190
00:01:54.931 Message: lib/bpf: Defining dependency "bpf"
00:01:54.931 Message: lib/cfgfile: Defining dependency "cfgfile"
00:01:54.931 Message: lib/compressdev: Defining dependency "compressdev"
00:01:54.931 Message: lib/cryptodev: Defining dependency "cryptodev"
00:01:54.931 Message: lib/distributor: Defining dependency "distributor"
00:01:54.931 Message: lib/dmadev: Defining dependency "dmadev"
00:01:54.931 Message: lib/efd: Defining dependency "efd"
00:01:54.931 Message: lib/eventdev: Defining dependency "eventdev"
00:01:54.931 Message: lib/dispatcher: Defining dependency "dispatcher"
00:01:54.931 Message: lib/gpudev: Defining dependency "gpudev"
00:01:54.931 Message: lib/gro: Defining dependency "gro"
00:01:54.931 Message: lib/gso: Defining dependency "gso"
00:01:54.931 Message: lib/ip_frag: Defining dependency "ip_frag"
00:01:54.931 Message: lib/jobstats: Defining dependency "jobstats"
00:01:54.931 Message: lib/latencystats: Defining dependency "latencystats"
00:01:54.931 Message: lib/lpm: Defining dependency "lpm"
00:01:54.931 Fetching value of define "__AVX512F__" : 1 (cached)
00:01:54.931 Fetching value of define "__AVX512DQ__" : 1 (cached)
00:01:54.931 Fetching value of define "__AVX512IFMA__" : (undefined)
00:01:54.931 Compiler for C supports arguments -mavx512f -mavx512dq -mavx512ifma: YES
00:01:54.931 Message: lib/member: Defining dependency "member"
00:01:54.931 Message: lib/pcapng: Defining dependency "pcapng"
00:01:54.931 Compiler for C supports arguments -Wno-cast-qual: YES
00:01:54.931 Message: lib/power: Defining dependency "power"
00:01:54.931 Message: lib/rawdev: Defining dependency "rawdev"
00:01:54.931 Message: lib/regexdev: Defining dependency "regexdev"
00:01:54.931 Message: lib/mldev: Defining dependency "mldev"
00:01:54.931 Message: lib/rib: Defining dependency "rib"
00:01:54.931 Message: lib/reorder: Defining dependency "reorder"
00:01:54.931 Message: lib/sched: Defining dependency "sched"
00:01:54.931 Message: lib/security: Defining dependency "security"
00:01:54.931 Message: lib/stack: Defining dependency "stack"
00:01:54.931 Has header "linux/userfaultfd.h" : YES
00:01:54.931 Has header "linux/vduse.h" : YES
00:01:54.931 Message: lib/vhost: Defining dependency "vhost"
00:01:54.931 Message: lib/ipsec: Defining dependency "ipsec"
00:01:54.931 Message: lib/pdcp: Defining dependency "pdcp"
00:01:54.931 Fetching value of define "__AVX512F__" : 1 (cached)
00:01:54.931 Fetching value of define "__AVX512DQ__" : 1 (cached)
00:01:54.931 Fetching value of define "__AVX512BW__" : 1 (cached)
00:01:54.931 Message: lib/fib: Defining dependency "fib"
00:01:54.931 Message: lib/port: Defining dependency "port"
00:01:54.931 Message: lib/pdump: Defining dependency "pdump"
00:01:54.931 Message: lib/table: Defining dependency "table"
00:01:54.931 Message: lib/pipeline: Defining dependency "pipeline"
00:01:54.931 Message: lib/graph: Defining dependency "graph"
00:01:54.931 Message: lib/node: Defining dependency "node"
00:01:54.931 Compiler for C supports arguments -Wno-format-truncation: YES (cached)
00:01:54.931 Message: drivers/bus/pci: Defining dependency "bus_pci"
00:01:54.931 Message: drivers/bus/vdev: Defining dependency "bus_vdev"
00:01:55.503 Message: drivers/mempool/ring: Defining dependency "mempool_ring"
00:01:55.503 Compiler for C supports arguments -Wno-sign-compare: YES
00:01:55.503 Compiler for C supports arguments -Wno-unused-value: YES
00:01:55.503 Compiler for C supports arguments -Wno-format: YES
00:01:55.503 Compiler for C supports arguments -Wno-format-security: YES
00:01:55.503 Compiler for C supports arguments -Wno-format-nonliteral: YES
00:01:55.503 Compiler for C supports arguments -Wno-strict-aliasing: YES
00:01:55.503 Compiler for C supports arguments -Wno-unused-but-set-variable: YES
00:01:55.503 Compiler for C supports arguments -Wno-unused-parameter: YES
00:01:55.503 Fetching value of define "__AVX512F__" : 1 (cached)
00:01:55.503 Fetching value of define "__AVX512BW__" : 1 (cached)
00:01:55.503 Compiler for C supports arguments -mavx512f: YES (cached)
00:01:55.503 Compiler for C supports arguments -mavx512bw: YES (cached)
00:01:55.503 Compiler for C supports arguments -march=skylake-avx512: YES
00:01:55.503 Message: drivers/net/i40e: Defining dependency "net_i40e"
00:01:55.503 Has header "sys/epoll.h" : YES
00:01:55.503 Program doxygen found: YES (/usr/bin/doxygen)
00:01:55.503 Configuring doxy-api-html.conf using configuration
00:01:55.503 Configuring doxy-api-man.conf using configuration
00:01:55.503 Program mandb found: YES (/usr/bin/mandb)
00:01:55.503 Program sphinx-build found: NO
00:01:55.503 Configuring rte_build_config.h using configuration
00:01:55.503 Message:
00:01:55.503 =================
00:01:55.503 Applications Enabled
00:01:55.503 =================
00:01:55.503
00:01:55.503 apps:
00:01:55.503 dumpcap, graph, pdump, proc-info, test-acl, test-bbdev, test-cmdline, test-compress-perf,
00:01:55.503 test-crypto-perf, test-dma-perf, test-eventdev, test-fib, test-flow-perf, test-gpudev, test-mldev, test-pipeline,
00:01:55.503 test-pmd, test-regex, test-sad, test-security-perf,
00:01:55.503
00:01:55.503 Message:
00:01:55.503 =================
00:01:55.503 Libraries Enabled
00:01:55.503 =================
00:01:55.503
00:01:55.503 libs:
00:01:55.503 log, kvargs, argparse, telemetry, eal, ring, rcu, mempool,
00:01:55.503 mbuf, net, meter, ethdev, pci, cmdline, metrics, hash,
00:01:55.503 timer, acl, bbdev, bitratestats, bpf, cfgfile, compressdev, cryptodev,
00:01:55.503 distributor, dmadev, efd, eventdev, dispatcher, gpudev, gro, gso,
00:01:55.503 ip_frag, jobstats, latencystats, lpm, member, pcapng, power, rawdev,
00:01:55.503 regexdev, mldev, rib, reorder, sched, security, stack, vhost,
00:01:55.503 ipsec, pdcp, fib, port, pdump, table, pipeline, graph,
00:01:55.503 node,
00:01:55.503
00:01:55.503 Message:
00:01:55.503 ===============
00:01:55.503 Drivers Enabled
00:01:55.503 ===============
00:01:55.503
00:01:55.503 common:
00:01:55.503
00:01:55.503 bus:
00:01:55.503 pci, vdev,
00:01:55.503 mempool:
00:01:55.503 ring,
00:01:55.503 dma:
00:01:55.503
00:01:55.503 net:
00:01:55.503 i40e,
00:01:55.503 raw:
00:01:55.503
00:01:55.503 crypto:
00:01:55.503
00:01:55.503 compress:
00:01:55.503
00:01:55.503 regex:
00:01:55.503
00:01:55.503 ml:
00:01:55.503
00:01:55.503 vdpa:
00:01:55.503
00:01:55.503 event:
00:01:55.503
00:01:55.503 baseband:
00:01:55.503
00:01:55.503 gpu:
00:01:55.503
00:01:55.503
00:01:55.503 Message:
00:01:55.503 =================
00:01:55.503 Content Skipped
00:01:55.503 =================
00:01:55.503
00:01:55.503 apps:
00:01:55.503
00:01:55.503 libs:
00:01:55.503
00:01:55.503 drivers:
00:01:55.503 common/cpt: not in enabled drivers build config
00:01:55.503 common/dpaax: not in enabled drivers build config
00:01:55.503 common/iavf: not in enabled drivers build config
00:01:55.503 common/idpf: not in enabled drivers build config
00:01:55.503 common/ionic: not in enabled drivers build config
00:01:55.503 common/mvep: not in enabled drivers build config
00:01:55.503 common/octeontx: not in enabled drivers build config
00:01:55.503 bus/auxiliary: not in enabled drivers build config
00:01:55.503 bus/cdx: not in enabled drivers build config
00:01:55.503 bus/dpaa: not in enabled drivers build config
00:01:55.503 bus/fslmc: not in enabled drivers build config
00:01:55.503 bus/ifpga: not in enabled drivers build config
00:01:55.503 bus/platform: not in enabled drivers build config
00:01:55.503 bus/uacce: not in enabled drivers build config
00:01:55.503 bus/vmbus: not in enabled drivers build config
00:01:55.503 common/cnxk: not in enabled drivers build config
00:01:55.503 common/mlx5: not in enabled drivers build config
00:01:55.503 common/nfp: not in enabled drivers build config
00:01:55.503 common/nitrox: not in enabled drivers build config
00:01:55.504 common/qat: not in enabled drivers build config
00:01:55.504 common/sfc_efx: not in enabled drivers build config
00:01:55.504 mempool/bucket: not in enabled drivers build config
00:01:55.504 mempool/cnxk: not in enabled drivers build config
00:01:55.504 mempool/dpaa: not in enabled drivers build config
00:01:55.504 mempool/dpaa2: not in enabled drivers build config
00:01:55.504 mempool/octeontx: not in enabled drivers build config
00:01:55.504 mempool/stack: not in enabled drivers build config
00:01:55.504 dma/cnxk: not in enabled drivers build config
00:01:55.504 dma/dpaa: not in enabled drivers build config
00:01:55.504 dma/dpaa2: not in enabled drivers build config
00:01:55.504 dma/hisilicon: not in enabled drivers build config
00:01:55.504 dma/idxd: not in enabled drivers build config
00:01:55.504 dma/ioat: not in enabled drivers build config
00:01:55.504 dma/skeleton: not in enabled drivers build config
00:01:55.504 net/af_packet: not in enabled drivers build config
00:01:55.504 net/af_xdp: not in enabled drivers build config
00:01:55.504 net/ark: not in enabled drivers build config
00:01:55.504 net/atlantic: not in enabled drivers build config
00:01:55.504 net/avp: not in enabled drivers build config
00:01:55.504 net/axgbe: not in enabled drivers build config
00:01:55.504 net/bnx2x: not in enabled drivers build config
00:01:55.504 net/bnxt: not in enabled drivers build config
00:01:55.504 net/bonding: not in enabled drivers build config
00:01:55.504 net/cnxk: not in enabled drivers build config
00:01:55.504 net/cpfl: not in enabled drivers build config
00:01:55.504 net/cxgbe: not in enabled drivers build config
00:01:55.504 net/dpaa: not in enabled drivers build config
00:01:55.504 net/dpaa2: not in enabled drivers build config
00:01:55.504 net/e1000: not in enabled drivers build config
00:01:55.504 net/ena: not in enabled drivers build config
00:01:55.504 net/enetc: not in enabled drivers build config
00:01:55.504 net/enetfec: not in enabled drivers build config
00:01:55.504 net/enic: not in enabled drivers build config
00:01:55.504 net/failsafe: not in enabled drivers build config
00:01:55.504 net/fm10k: not in enabled drivers build config
00:01:55.504 net/gve: not in enabled drivers build config
00:01:55.504 net/hinic: not in enabled drivers build config
00:01:55.504 net/hns3: not in enabled drivers build config
00:01:55.504 net/iavf: not in enabled drivers build config
00:01:55.504 net/ice: not in enabled drivers build config
00:01:55.504 net/idpf: not in enabled drivers build config
00:01:55.504 net/igc: not in enabled drivers build config
00:01:55.504 net/ionic: not in enabled drivers build config
00:01:55.504 net/ipn3ke: not in enabled drivers build config
00:01:55.504 net/ixgbe: not in enabled drivers build config
00:01:55.504 net/mana: not in enabled drivers build config
00:01:55.504 net/memif: not in enabled drivers build config
00:01:55.504 net/mlx4: not in enabled drivers build config
00:01:55.504 net/mlx5: not in enabled drivers build config
00:01:55.504 net/mvneta: not in enabled drivers build config
00:01:55.504 net/mvpp2: not in enabled drivers build config
00:01:55.504 net/netvsc: not in enabled drivers build config
00:01:55.504 net/nfb: not in enabled drivers build config
00:01:55.504 net/nfp: not in enabled drivers build config
00:01:55.504 net/ngbe: not in enabled drivers build config
00:01:55.504 net/null: not in enabled drivers build config
00:01:55.504 net/octeontx: not in enabled drivers build config
00:01:55.504 net/octeon_ep: not in enabled drivers build config
00:01:55.504 net/pcap: not in enabled drivers build config
00:01:55.504 net/pfe: not in enabled drivers build config
00:01:55.504 net/qede: not in enabled drivers build config
00:01:55.504 net/ring: not in enabled drivers build config
00:01:55.504 net/sfc: not in enabled drivers build config
00:01:55.504 net/softnic: not in enabled drivers build config
00:01:55.504 net/tap: not in enabled drivers build config
00:01:55.504 net/thunderx: not in enabled drivers build config
00:01:55.504 net/txgbe: not in enabled drivers build config
00:01:55.504 net/vdev_netvsc: not in enabled drivers build config
00:01:55.504 net/vhost: not in enabled drivers build config
00:01:55.504 net/virtio: not in enabled drivers build config
00:01:55.504 net/vmxnet3: not in enabled drivers build config
00:01:55.504 raw/cnxk_bphy: not in enabled drivers build config
00:01:55.504 raw/cnxk_gpio: not in enabled drivers build config
00:01:55.504 raw/dpaa2_cmdif: not in enabled drivers build config
00:01:55.504 raw/ifpga: not in enabled drivers build config
00:01:55.504 raw/ntb: not in enabled drivers build config
00:01:55.504 raw/skeleton: not in enabled drivers build config
00:01:55.504 crypto/armv8: not in enabled drivers build config
00:01:55.504 crypto/bcmfs: not in enabled drivers build config
00:01:55.504 crypto/caam_jr: not in enabled drivers build config
00:01:55.504 crypto/ccp: not in enabled drivers build config
00:01:55.504 crypto/cnxk: not in enabled drivers build config
00:01:55.504 crypto/dpaa_sec: not in enabled drivers build config
00:01:55.504 crypto/dpaa2_sec: not in enabled drivers build config
00:01:55.504 crypto/ipsec_mb: not in enabled drivers build config
00:01:55.504 crypto/mlx5: not in enabled drivers build config
00:01:55.504 crypto/mvsam: not in enabled drivers build config
00:01:55.504 crypto/nitrox: not in enabled drivers build config
00:01:55.504 crypto/null: not in enabled drivers build config
00:01:55.504 crypto/octeontx: not in enabled drivers build config
00:01:55.504 crypto/openssl: not in enabled drivers build config
00:01:55.504 crypto/scheduler: not in enabled drivers build config
00:01:55.504 crypto/uadk: not in enabled drivers build config
00:01:55.504 crypto/virtio: not in enabled drivers build config
00:01:55.504 compress/isal: not in enabled drivers build config
00:01:55.504 compress/mlx5: not in enabled drivers build config
00:01:55.504 compress/nitrox: not in enabled drivers build config
00:01:55.504 compress/octeontx: not in enabled drivers build config
00:01:55.504 compress/zlib: not in enabled drivers build config
00:01:55.504 regex/mlx5: not in enabled drivers build config
00:01:55.504 regex/cn9k: not in enabled drivers build config
00:01:55.504 ml/cnxk: not in enabled drivers build config
00:01:55.504 vdpa/ifc: not in enabled drivers build config
00:01:55.504 vdpa/mlx5: not in enabled drivers build config
00:01:55.504 vdpa/nfp: not in enabled drivers build config
00:01:55.504 vdpa/sfc: not in enabled drivers build config
00:01:55.504 event/cnxk: not in enabled drivers build config
00:01:55.504 event/dlb2: not in enabled drivers build config
00:01:55.504 event/dpaa: not in enabled drivers build config
00:01:55.504 event/dpaa2: not in enabled drivers build config
00:01:55.504 event/dsw: not in enabled drivers build config
00:01:55.504 event/opdl: not in enabled drivers build config
00:01:55.504 event/skeleton: not in enabled drivers build config
00:01:55.504 event/sw: not in enabled drivers build config
00:01:55.504 event/octeontx: not in enabled drivers build config
00:01:55.504 baseband/acc: not in enabled drivers build config
00:01:55.504 baseband/fpga_5gnr_fec: not in enabled drivers build config
00:01:55.504 baseband/fpga_lte_fec: not in enabled drivers build config
00:01:55.504 baseband/la12xx: not in enabled drivers build config
00:01:55.504 baseband/null: not in enabled drivers build config
00:01:55.504 baseband/turbo_sw: not in enabled drivers build config
00:01:55.504 gpu/cuda: not in enabled drivers build config
00:01:55.504
00:01:55.504
00:01:55.504 Build targets in project: 221
00:01:55.504
00:01:55.504 DPDK 24.07.0-rc0
00:01:55.504
00:01:55.504 User defined options
00:01:55.504 libdir : lib
00:01:55.504 prefix : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:55.504 c_args : -fPIC -g -fcommon -Werror -Wno-stringop-overflow
00:01:55.504 c_link_args :
00:01:55.504 enable_docs : false
00:01:55.504 enable_drivers: bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,
00:01:55.504 enable_kmods : false
00:01:55.504 machine : native
00:01:55.504 tests : false
00:01:55.505
00:01:55.505 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja
00:01:55.505 WARNING: Running the setup command as `meson [options]` instead of `meson setup [options]` is ambiguous and deprecated.
20:49:11 -- common/autobuild_common.sh@186 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112
00:01:55.777 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp'
00:01:55.777 [1/719] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o
00:01:55.777 [2/719] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o
00:01:55.777 [3/719] Compiling C object lib/librte_log.a.p/log_log_linux.c.o
00:01:55.777 [4/719] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o
00:01:56.044 [5/719] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o
00:01:56.044 [6/719] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o
00:01:56.044 [7/719] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o
00:01:56.044 [8/719] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o
00:01:56.044 [9/719] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o
00:01:56.044 [10/719] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o
00:01:56.044 [11/719] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o
00:01:56.044 [12/719] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o
00:01:56.044 [13/719] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o
00:01:56.044 [14/719] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o
00:01:56.044 [15/719] Linking static target lib/librte_kvargs.a
00:01:56.044 [16/719] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o
00:01:56.044 [17/719] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o
00:01:56.044 [18/719] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o
00:01:56.044 [19/719] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o
00:01:56.044 [20/719] Linking static target lib/librte_pci.a
00:01:56.044 [21/719] Compiling C object lib/librte_log.a.p/log_log.c.o
00:01:56.044 [22/719] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o
00:01:56.044 [23/719] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o
00:01:56.044 [24/719] Compiling C object lib/librte_hash.a.p/hash_rte_hash_crc.c.o
00:01:56.044 [25/719] Linking static target lib/librte_log.a
00:01:56.044 [26/719] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o
00:01:56.044 [27/719] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o
00:01:56.044 [28/719] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o
00:01:56.307 [29/719] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o
00:01:56.307 [30/719] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o
00:01:56.307 [31/719] Compiling C object lib/librte_argparse.a.p/argparse_rte_argparse.c.o
00:01:56.307 [32/719] Linking static target lib/librte_argparse.a
00:01:56.307 [33/719] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o
00:01:56.307 [34/719] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output)
00:01:56.307 [35/719] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output)
00:01:56.573 [36/719] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o
00:01:56.573 [37/719] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o
00:01:56.573 [38/719] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o
00:01:56.573 [39/719] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o
00:01:56.573 [40/719] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o
00:01:56.573 [41/719] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o
00:01:56.573 [42/719] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o
00:01:56.573 [43/719] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o
00:01:56.573 [44/719] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o
00:01:56.573 [45/719] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o
00:01:56.573 [46/719] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o
00:01:56.573 [47/719] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o
00:01:56.573 [48/719] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o
00:01:56.573 [49/719] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o
00:01:56.573 [50/719] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o
00:01:56.573 [51/719] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o
00:01:56.573 [52/719] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o
00:01:56.573 [53/719] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o
00:01:56.573 [54/719] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o
00:01:56.573 [55/719] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o
00:01:56.573 [56/719] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o
00:01:56.573 [57/719] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o
00:01:56.573 [58/719] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o
00:01:56.574 [59/719] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o
00:01:56.574 [60/719] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o
00:01:56.574 [61/719] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o
00:01:56.574 [62/719] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o
00:01:56.574 [63/719] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o
00:01:56.574 [64/719] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o
00:01:56.574 [65/719] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o
00:01:56.574 [66/719] Linking static target lib/librte_meter.a
00:01:56.574 [67/719] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o
00:01:56.574 [68/719] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o
00:01:56.574 [69/719] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o
00:01:56.574 [70/719] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o
00:01:56.574 [71/719] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o
00:01:56.574 [72/719] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o
00:01:56.574 [73/719] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o
00:01:56.574 [74/719] Compiling C object lib/librte_hash.a.p/hash_rte_thash_gfni.c.o
00:01:56.574 [75/719] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o
00:01:56.574 [76/719] Generating lib/argparse.sym_chk with a custom command (wrapped by meson to capture output)
00:01:56.574 [77/719] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o
00:01:56.574 [78/719] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o
00:01:56.574 [79/719] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o
00:01:56.574 [80/719] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o
00:01:56.574 [81/719] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_linux_ethtool.c.o
00:01:56.574 [82/719] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o
00:01:56.574 [83/719] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o
00:01:56.574 [84/719] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o
00:01:56.574 [85/719] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o
00:01:56.574 [86/719] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o
00:01:56.574 [87/719] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics.c.o
00:01:56.574 [88/719] Linking static target lib/librte_cmdline.a
00:01:56.574 [89/719] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o
00:01:56.574 [90/719] Linking static target lib/librte_ring.a
00:01:56.574 [91/719] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o
00:01:56.574 [92/719] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o
00:01:56.841 [93/719] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o
00:01:56.841 [94/719] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o
00:01:56.841 [95/719] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o
00:01:56.841 [96/719] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o
00:01:56.841 [97/719] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o
00:01:56.841 [98/719] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o
00:01:56.841 [99/719] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o
00:01:56.841 [100/719] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o
00:01:56.841 [101/719] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o
00:01:56.841 [102/719] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics_telemetry.c.o
00:01:56.841 [103/719] Linking static target lib/net/libnet_crc_avx512_lib.a
00:01:56.841 [104/719] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o
00:01:56.841 [105/719] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o
00:01:56.841 [106/719] Linking static target lib/librte_metrics.a
00:01:56.841 [107/719] Compiling C object lib/librte_acl.a.p/acl_tb_mem.c.o
00:01:56.841 [108/719] Compiling C object lib/librte_net.a.p/net_rte_net.c.o
00:01:56.841 [109/719] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o
00:01:56.841 [110/719] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o
00:01:56.841 [111/719] Linking static target lib/librte_net.a
00:01:56.841 [112/719] Compiling
C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:01:56.841 [113/719] Compiling C object lib/librte_acl.a.p/acl_rte_acl.c.o 00:01:56.841 [114/719] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:01:56.841 [115/719] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:01:56.841 [116/719] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:01:56.841 [117/719] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:01:56.841 [118/719] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:01:56.841 [119/719] Compiling C object lib/librte_cfgfile.a.p/cfgfile_rte_cfgfile.c.o 00:01:56.842 [120/719] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.842 [121/719] Linking static target lib/librte_cfgfile.a 00:01:56.842 [122/719] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:01:57.107 [123/719] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:01:57.107 [124/719] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:01:57.107 [125/719] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:01:57.107 [126/719] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.107 [127/719] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:01:57.107 [128/719] Linking target lib/librte_log.so.24.2 00:01:57.107 [129/719] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:01:57.107 [130/719] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:01:57.107 [131/719] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:01:57.107 [132/719] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:01:57.107 [133/719] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:01:57.107 [134/719] Compiling C object lib/librte_bpf.a.p/bpf_bpf.c.o 00:01:57.107 [135/719] Compiling C object lib/librte_bpf.a.p/bpf_bpf_dump.c.o 00:01:57.107 [136/719] Compiling C object lib/librte_acl.a.p/acl_acl_gen.c.o 00:01:57.107 [137/719] Compiling C object lib/librte_bitratestats.a.p/bitratestats_rte_bitrate.c.o 00:01:57.107 [138/719] Compiling C object lib/librte_bpf.a.p/bpf_bpf_stub.c.o 00:01:57.107 [139/719] Linking static target lib/librte_bitratestats.a 00:01:57.107 [140/719] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.107 [141/719] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:01:57.107 [142/719] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load.c.o 00:01:57.107 [143/719] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:01:57.107 [144/719] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:01:57.107 [145/719] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:01:57.107 [146/719] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_match_sse.c.o 00:01:57.107 [147/719] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:01:57.107 [148/719] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:01:57.107 [149/719] Compiling C object lib/librte_acl.a.p/acl_acl_run_scalar.c.o 00:01:57.107 [150/719] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:01:57.107 [151/719] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.107 [152/719] Linking static target 
lib/librte_timer.a 00:01:57.107 [153/719] Generating symbol file lib/librte_log.so.24.2.p/librte_log.so.24.2.symbols 00:01:57.107 [154/719] Linking static target lib/librte_mempool.a 00:01:57.367 [155/719] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:01:57.368 [156/719] Compiling C object lib/librte_sched.a.p/sched_rte_approx.c.o 00:01:57.368 [157/719] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:01:57.368 [158/719] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_private.c.o 00:01:57.368 [159/719] Linking target lib/librte_argparse.so.24.2 00:01:57.368 [160/719] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:01:57.368 [161/719] Linking target lib/librte_kvargs.so.24.2 00:01:57.368 [162/719] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:01:57.368 [163/719] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load_elf.c.o 00:01:57.368 [164/719] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:01:57.368 [165/719] Compiling C object lib/librte_gso.a.p/gso_gso_udp4.c.o 00:01:57.368 [166/719] Compiling C object lib/librte_jobstats.a.p/jobstats_rte_jobstats.c.o 00:01:57.368 [167/719] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_ring.c.o 00:01:57.368 [168/719] Linking static target lib/librte_jobstats.a 00:01:57.368 [169/719] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_single.c.o 00:01:57.368 [170/719] Compiling C object lib/librte_gso.a.p/gso_gso_tcp4.c.o 00:01:57.368 [171/719] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_udp4.c.o 00:01:57.368 [172/719] Generating lib/metrics.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.368 [173/719] Compiling C object lib/librte_gso.a.p/gso_rte_gso.c.o 00:01:57.368 [174/719] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:01:57.368 [175/719] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_tcp4.c.o 00:01:57.368 [176/719] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_reassembly.c.o 00:01:57.368 [177/719] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:01:57.368 [178/719] Compiling C object lib/librte_bpf.a.p/bpf_bpf_convert.c.o 00:01:57.368 [179/719] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:01:57.368 [180/719] Compiling C object lib/librte_bbdev.a.p/bbdev_rte_bbdev.c.o 00:01:57.368 [181/719] Compiling C object lib/librte_member.a.p/member_rte_member.c.o 00:01:57.368 [182/719] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:01:57.368 [183/719] Linking static target lib/librte_bbdev.a 00:01:57.368 [184/719] Generating lib/bitratestats.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.368 [185/719] Linking static target lib/librte_compressdev.a 00:01:57.368 [186/719] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_reassembly.c.o 00:01:57.368 [187/719] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:01:57.368 [188/719] Compiling C object lib/librte_bpf.a.p/bpf_bpf_exec.c.o 00:01:57.368 [189/719] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:01:57.368 [190/719] Generating lib/cfgfile.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.629 [191/719] Generating symbol file lib/librte_kvargs.so.24.2.p/librte_kvargs.so.24.2.symbols 00:01:57.629 [192/719] Compiling C object lib/librte_port.a.p/port_port_log.c.o 00:01:57.629 [193/719] Compiling C 
object lib/librte_gro.a.p/gro_gro_tcp6.c.o 00:01:57.629 [194/719] Compiling C object lib/member/libsketch_avx512_tmp.a.p/rte_member_sketch_avx512.c.o 00:01:57.629 [195/719] Compiling C object lib/librte_dispatcher.a.p/dispatcher_rte_dispatcher.c.o 00:01:57.629 [196/719] Linking static target lib/member/libsketch_avx512_tmp.a 00:01:57.629 [197/719] Linking static target lib/librte_dispatcher.a 00:01:57.629 [198/719] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils.c.o 00:01:57.629 [199/719] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:01:57.629 [200/719] Compiling C object lib/librte_mldev.a.p/mldev_rte_mldev_pmd.c.o 00:01:57.629 [201/719] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:01:57.629 [202/719] Compiling C object lib/librte_latencystats.a.p/latencystats_rte_latencystats.c.o 00:01:57.629 [203/719] Compiling C object lib/librte_gro.a.p/gro_gro_tcp4.c.o 00:01:57.629 [204/719] Linking static target lib/librte_latencystats.a 00:01:57.629 [205/719] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:01:57.629 [206/719] Linking static target lib/librte_rcu.a 00:01:57.629 [207/719] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:01:57.629 [208/719] Compiling C object lib/librte_sched.a.p/sched_rte_pie.c.o 00:01:57.629 [209/719] Linking static target lib/librte_telemetry.a 00:01:57.629 [210/719] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:01:57.629 [211/719] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:01:57.629 [212/719] Compiling C object lib/librte_sched.a.p/sched_rte_red.c.o 00:01:57.629 [213/719] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:01:57.629 [214/719] Compiling C object lib/librte_gro.a.p/gro_rte_gro.c.o 00:01:57.629 [215/719] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_udp4.c.o 00:01:57.629 [216/719] Compiling C object lib/librte_gro.a.p/gro_gro_udp4.c.o 00:01:57.629 [217/719] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ip_frag_common.c.o 00:01:57.629 [218/719] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils_scalar_bfloat16.c.o 00:01:57.629 [219/719] Compiling C object lib/librte_stack.a.p/stack_rte_stack_std.c.o 00:01:57.629 [220/719] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_tcp4.c.o 00:01:57.629 [221/719] Compiling C object lib/librte_ip_frag.a.p/ip_frag_ip_frag_internal.c.o 00:01:57.629 [222/719] Compiling C object lib/librte_gpudev.a.p/gpudev_gpudev.c.o 00:01:57.629 [223/719] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:01:57.629 [224/719] Linking static target lib/librte_eal.a 00:01:57.629 [225/719] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:01:57.629 [226/719] Linking static target lib/librte_gro.a 00:01:57.629 [227/719] Linking static target lib/librte_gpudev.a 00:01:57.629 [228/719] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_fragmentation.c.o 00:01:57.629 [229/719] Linking static target lib/librte_dmadev.a 00:01:57.629 [230/719] Compiling C object lib/librte_stack.a.p/stack_rte_stack.c.o 00:01:57.629 [231/719] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_trace_points.c.o 00:01:57.629 [232/719] Compiling C object lib/librte_stack.a.p/stack_rte_stack_lf.c.o 00:01:57.629 [233/719] Linking static target lib/librte_stack.a 00:01:57.629 [234/719] Compiling C object lib/librte_bpf.a.p/bpf_bpf_pkt.c.o 00:01:57.629 [235/719] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm.c.o 
00:01:57.629 [236/719] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.896 [237/719] Compiling C object lib/librte_gso.a.p/gso_gso_common.c.o 00:01:57.896 [238/719] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor.c.o 00:01:57.896 [239/719] Linking static target lib/librte_distributor.a 00:01:57.896 [240/719] Linking static target lib/librte_gso.a 00:01:57.896 [241/719] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:01:57.896 [242/719] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:01:57.896 [243/719] Generating lib/jobstats.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.896 [244/719] Compiling C object lib/librte_regexdev.a.p/regexdev_rte_regexdev.c.o 00:01:57.896 [245/719] Linking static target lib/librte_mbuf.a 00:01:57.896 [246/719] Linking static target lib/librte_regexdev.a 00:01:57.896 [247/719] Compiling C object lib/librte_table.a.p/table_table_log.c.o 00:01:57.896 [248/719] Compiling C object lib/librte_table.a.p/table_rte_swx_keycmp.c.o 00:01:57.896 [249/719] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_fragmentation.c.o 00:01:57.896 [250/719] Compiling C object lib/librte_mldev.a.p/mldev_rte_mldev.c.o 00:01:57.896 [251/719] Compiling C object lib/librte_rawdev.a.p/rawdev_rte_rawdev.c.o 00:01:57.896 [252/719] Linking static target lib/librte_ip_frag.a 00:01:57.896 [253/719] Linking static target lib/librte_rawdev.a 00:01:57.896 [254/719] Compiling C object lib/librte_pcapng.a.p/pcapng_rte_pcapng.c.o 00:01:57.896 [255/719] Linking static target lib/librte_pcapng.a 00:01:57.896 [256/719] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:01:57.896 [257/719] Linking static target lib/librte_power.a 00:01:57.896 [258/719] Compiling C object lib/librte_ipsec.a.p/ipsec_ses.c.o 00:01:57.896 [259/719] Compiling C object lib/librte_bpf.a.p/bpf_bpf_validate.c.o 00:01:57.896 [260/719] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_reorder.c.o 00:01:57.896 [261/719] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_dma_adapter.c.o 00:01:57.896 [262/719] Compiling C object lib/librte_rib.a.p/rib_rte_rib.c.o 00:01:57.896 [263/719] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils_scalar.c.o 00:01:58.154 [264/719] Linking static target lib/librte_mldev.a 00:01:58.154 [265/719] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:01:58.154 [266/719] Compiling C object lib/librte_fib.a.p/fib_rte_fib.c.o 00:01:58.154 [267/719] Generating lib/latencystats.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.154 [268/719] Linking static target lib/librte_reorder.a 00:01:58.154 [269/719] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_ctrl_pdu.c.o 00:01:58.154 [270/719] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_crypto.c.o 00:01:58.154 [271/719] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:01:58.154 [272/719] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.154 [273/719] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_telemetry.c.o 00:01:58.154 [274/719] Compiling C object lib/librte_fib.a.p/fib_rte_fib6.c.o 00:01:58.154 [275/719] Generating lib/gso.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.154 [276/719] Generating lib/stack.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.154 [277/719] Generating lib/rcu.sym_chk with a custom 
command (wrapped by meson to capture output) 00:01:58.154 [278/719] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:01:58.154 [279/719] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:01:58.154 [280/719] Generating lib/gro.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.154 [281/719] Linking static target lib/librte_security.a 00:01:58.154 [282/719] Compiling C object lib/librte_member.a.p/member_rte_member_vbf.c.o 00:01:58.154 [283/719] Compiling C object lib/librte_port.a.p/port_rte_port_sched.c.o 00:01:58.154 [284/719] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:01:58.154 [285/719] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:01:58.154 [286/719] Compiling C object lib/librte_fib.a.p/fib_trie_avx512.c.o 00:01:58.154 [287/719] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_cnt.c.o 00:01:58.154 [288/719] Generating lib/distributor.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.154 [289/719] Compiling C object lib/librte_bpf.a.p/bpf_bpf_jit_x86.c.o 00:01:58.154 [290/719] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:01:58.418 [291/719] Compiling C object lib/librte_fib.a.p/fib_dir24_8_avx512.c.o 00:01:58.418 [292/719] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm6.c.o 00:01:58.418 [293/719] Linking static target lib/librte_bpf.a 00:01:58.418 [294/719] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.418 [295/719] Linking static target lib/librte_lpm.a 00:01:58.418 [296/719] Compiling C object lib/librte_node.a.p/node_null.c.o 00:01:58.418 [297/719] Compiling C object lib/librte_ipsec.a.p/ipsec_sa.c.o 00:01:58.418 [298/719] Generating lib/pcapng.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.418 [299/719] Compiling C object lib/librte_rib.a.p/rib_rte_rib6.c.o 00:01:58.418 [300/719] Compiling C object lib/librte_member.a.p/member_rte_member_ht.c.o 00:01:58.418 [301/719] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.418 [302/719] Linking static target lib/librte_rib.a 00:01:58.418 [303/719] Generating lib/dispatcher.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.418 [304/719] Compiling C object lib/librte_pdcp.a.p/pdcp_rte_pdcp.c.o 00:01:58.418 [305/719] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.418 [306/719] Generating lib/ip_frag.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.418 [307/719] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:01:58.418 [308/719] Compiling C object lib/librte_table.a.p/table_rte_table_array.c.o 00:01:58.418 [309/719] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:01:58.418 [310/719] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:01:58.418 [311/719] Linking target lib/librte_telemetry.so.24.2 00:01:58.418 [312/719] Compiling C object lib/librte_table.a.p/table_rte_table_stub.c.o 00:01:58.418 [313/719] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_timer_adapter.c.o 00:01:58.418 [314/719] Compiling C object lib/librte_table.a.p/table_rte_table_hash_cuckoo.c.o 00:01:58.418 [315/719] Compiling C object lib/librte_table.a.p/table_rte_swx_table_wm.c.o 00:01:58.418 [316/719] Generating lib/bbdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.418 
[317/719] Compiling C object lib/librte_table.a.p/table_rte_table_lpm.c.o 00:01:58.418 [318/719] Compiling C object lib/librte_table.a.p/table_rte_swx_table_selector.c.o 00:01:58.680 [319/719] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.680 [320/719] Compiling C object lib/librte_table.a.p/table_rte_table_lpm_ipv6.c.o 00:01:58.680 [321/719] Compiling C object lib/librte_port.a.p/port_rte_port_frag.c.o 00:01:58.680 [322/719] Compiling C object lib/librte_table.a.p/table_rte_swx_table_learner.c.o 00:01:58.680 [323/719] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_port_in_action.c.o 00:01:58.680 [324/719] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_crypto_adapter.c.o 00:01:58.680 [325/719] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.680 [326/719] Compiling C object lib/librte_port.a.p/port_rte_port_ras.c.o 00:01:58.680 [327/719] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_tx_adapter.c.o 00:01:58.680 [328/719] Compiling C object lib/librte_acl.a.p/acl_acl_bld.c.o 00:01:58.680 [329/719] Compiling C object lib/librte_port.a.p/port_rte_port_fd.c.o 00:01:58.680 [330/719] Compiling C object lib/librte_table.a.p/table_rte_swx_table_em.c.o 00:01:58.680 [331/719] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ethdev.c.o 00:01:58.680 [332/719] Generating symbol file lib/librte_telemetry.so.24.2.p/librte_telemetry.so.24.2.symbols 00:01:58.680 [333/719] Generating lib/rawdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.681 [334/719] Compiling C object lib/librte_graph.a.p/graph_graph_ops.c.o 00:01:58.681 [335/719] Compiling C object lib/librte_port.a.p/port_rte_swx_port_fd.c.o 00:01:58.681 [336/719] Compiling C object lib/librte_efd.a.p/efd_rte_efd.c.o 00:01:58.681 [337/719] Compiling C object lib/librte_graph.a.p/graph_graph_debug.c.o 00:01:58.681 [338/719] Compiling C object lib/librte_table.a.p/table_rte_table_acl.c.o 00:01:58.681 [339/719] Linking static target lib/librte_efd.a 00:01:58.681 [340/719] Compiling C object lib/librte_port.a.p/port_rte_port_ethdev.c.o 00:01:58.681 [341/719] Compiling C object lib/librte_graph.a.p/graph_rte_graph_worker.c.o 00:01:58.681 [342/719] Generating lib/bpf.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.681 [343/719] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:01:58.943 [344/719] Compiling C object lib/librte_port.a.p/port_rte_port_sym_crypto.c.o 00:01:58.943 [345/719] Compiling C object lib/librte_fib.a.p/fib_trie.c.o 00:01:58.943 [346/719] Compiling C object lib/librte_graph.a.p/graph_graph_populate.c.o 00:01:58.943 [347/719] Compiling C object lib/librte_graph.a.p/graph_node.c.o 00:01:58.943 [348/719] Compiling C object lib/librte_port.a.p/port_rte_swx_port_source_sink.c.o 00:01:58.943 [349/719] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_cmdline_test.c.o 00:01:58.943 [350/719] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.943 [351/719] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_commands.c.o 00:01:58.943 [352/719] Generating lib/lpm.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.943 [353/719] Compiling C object lib/librte_graph.a.p/graph_graph_pcap.c.o 00:01:58.943 [354/719] Compiling C object lib/librte_port.a.p/port_rte_port_eventdev.c.o 00:01:58.943 [355/719] Compiling C object 
lib/librte_graph.a.p/graph_graph_stats.c.o 00:01:58.943 [356/719] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.943 [357/719] Generating app/graph/commands_hdr with a custom command (wrapped by meson to capture output) 00:01:58.943 [358/719] Compiling C object lib/librte_node.a.p/node_log.c.o 00:01:58.943 [359/719] Compiling C object lib/librte_node.a.p/node_pkt_drop.c.o 00:01:58.943 [360/719] Compiling C object lib/librte_node.a.p/node_ethdev_ctrl.c.o 00:01:58.943 [361/719] Compiling C object lib/librte_port.a.p/port_rte_port_source_sink.c.o 00:01:58.943 [362/719] Compiling C object lib/librte_node.a.p/node_ethdev_tx.c.o 00:01:58.943 [363/719] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:01:58.943 [364/719] Compiling C object lib/librte_node.a.p/node_kernel_tx.c.o 00:01:58.943 [365/719] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_sad.c.o 00:01:58.943 [366/719] Compiling C object lib/librte_fib.a.p/fib_dir24_8.c.o 00:01:58.943 [367/719] Compiling C object lib/librte_graph.a.p/graph_graph.c.o 00:01:58.943 [368/719] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.943 [369/719] Linking static target lib/librte_fib.a 00:01:58.943 [370/719] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ring.c.o 00:01:59.207 [371/719] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_diag.c.o 00:01:59.207 [372/719] Generating lib/rib.sym_chk with a custom command (wrapped by meson to capture output) 00:01:59.207 [373/719] Generating lib/regexdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:59.207 [374/719] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_eventdev.c.o 00:01:59.207 [375/719] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key8.c.o 00:01:59.207 [376/719] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:01:59.207 [377/719] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_pipeline.c.o 00:01:59.207 [378/719] Generating lib/efd.sym_chk with a custom command (wrapped by meson to capture output) 00:01:59.207 [379/719] Compiling C object lib/librte_node.a.p/node_ethdev_rx.c.o 00:01:59.207 [380/719] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_hmc.c.o 00:01:59.207 [381/719] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:01:59.207 [382/719] Generating lib/gpudev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:59.207 [383/719] Compiling C object lib/librte_node.a.p/node_ip4_local.c.o 00:01:59.207 [384/719] Compiling C object lib/librte_node.a.p/node_ip4_reassembly.c.o 00:01:59.207 [385/719] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:01:59.207 [386/719] Compiling C object lib/librte_pdump.a.p/pdump_rte_pdump.c.o 00:01:59.207 [387/719] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_test.c.o 00:01:59.207 [388/719] Linking static target lib/librte_pdump.a 00:01:59.207 [389/719] Compiling C object lib/librte_graph.a.p/graph_rte_graph_model_mcore_dispatch.c.o 00:01:59.207 [390/719] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:01:59.207 [391/719] Linking static target drivers/libtmp_rte_bus_vdev.a 00:01:59.207 [392/719] Linking static target lib/librte_graph.a 00:01:59.207 [393/719] Compiling C object lib/librte_acl.a.p/acl_acl_run_sse.c.o 00:01:59.207 [394/719] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 
00:01:59.474 [395/719] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:01:59.474 [396/719] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key16.c.o 00:01:59.474 [397/719] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_main.c.o 00:01:59.474 [398/719] Compiling C object lib/librte_table.a.p/table_rte_table_hash_ext.c.o 00:01:59.474 [399/719] Compiling C object lib/librte_node.a.p/node_udp4_input.c.o 00:01:59.474 [400/719] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:01:59.474 [401/719] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_parser.c.o 00:01:59.474 [402/719] Linking static target drivers/libtmp_rte_bus_pci.a 00:01:59.474 [403/719] Compiling C object lib/librte_node.a.p/node_pkt_cls.c.o 00:01:59.474 [404/719] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_dcb.c.o 00:01:59.474 [405/719] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_test.c.o 00:01:59.474 [406/719] Compiling C object lib/librte_table.a.p/table_rte_table_hash_lru.c.o 00:01:59.474 [407/719] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_lan_hmc.c.o 00:01:59.474 [408/719] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key32.c.o 00:01:59.740 [409/719] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_common.c.o 00:01:59.740 [410/719] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vectors.c.o 00:01:59.740 [411/719] Generating lib/fib.sym_chk with a custom command (wrapped by meson to capture output) 00:01:59.740 [412/719] Linking static target lib/librte_table.a 00:01:59.740 [413/719] Compiling C object lib/librte_node.a.p/node_kernel_rx.c.o 00:01:59.740 [414/719] Compiling C object app/dpdk-graph.p/graph_cli.c.o 00:01:59.740 [415/719] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_options_parse.c.o 00:01:59.740 [416/719] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_vf_representor.c.o 00:01:59.740 [417/719] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:01:59.740 [418/719] Compiling C object lib/librte_sched.a.p/sched_rte_sched.c.o 00:01:59.740 [419/719] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_main.c.o 00:01:59.740 [420/719] Linking static target lib/librte_sched.a 00:01:59.740 [421/719] Compiling C object lib/librte_node.a.p/node_ip6_lookup.c.o 00:01:59.740 [422/719] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:59.740 [423/719] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_recycle_mbufs_vec_common.c.o 00:01:59.740 [424/719] Compiling C object app/dpdk-graph.p/graph_ethdev_rx.c.o 00:01:59.740 [425/719] Generating lib/pdump.sym_chk with a custom command (wrapped by meson to capture output) 00:01:59.740 [426/719] Linking static target drivers/librte_bus_vdev.a 00:01:59.740 [427/719] Compiling C object app/dpdk-graph.p/graph_conn.c.o 00:01:59.740 [428/719] Compiling C object app/dpdk-graph.p/graph_ip6_route.c.o 00:01:59.740 [429/719] Compiling C object app/dpdk-test-mldev.p/test-mldev_parser.c.o 00:01:59.740 [430/719] Compiling C object drivers/librte_bus_vdev.so.24.2.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:59.740 [431/719] Compiling C object app/dpdk-graph.p/graph_ip4_route.c.o 00:01:59.740 [432/719] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_adminq.c.o 00:01:59.740 [433/719] Compiling C object app/dpdk-graph.p/graph_mempool.c.o 00:01:59.740 [434/719] 
Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_tm.c.o 00:01:59.740 [435/719] Compiling C object app/dpdk-graph.p/graph_l2fwd.c.o 00:01:59.740 [436/719] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vector_parsing.c.o 00:01:59.740 [437/719] Compiling C object app/dpdk-graph.p/graph_main.c.o 00:01:59.740 [438/719] Compiling C object app/dpdk-graph.p/graph_l3fwd.c.o 00:01:59.740 [439/719] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:01:59.740 [440/719] Compiling C object app/dpdk-test-acl.p/test-acl_main.c.o 00:01:59.740 [441/719] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:01:59.740 [442/719] Linking static target lib/librte_cryptodev.a 00:01:59.740 [443/719] Compiling C object app/dpdk-graph.p/graph_utils.c.o 00:01:59.999 [444/719] Compiling C object app/dpdk-graph.p/graph_neigh.c.o 00:01:59.999 [445/719] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:59.999 [446/719] Compiling C object app/dpdk-test-dma-perf.p/test-dma-perf_main.c.o 00:01:59.999 [447/719] Linking static target drivers/librte_bus_pci.a 00:01:59.999 [448/719] Compiling C object drivers/librte_bus_pci.so.24.2.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:59.999 [449/719] Compiling C object app/dpdk-graph.p/graph_graph.c.o 00:01:59.999 [450/719] Compiling C object lib/librte_node.a.p/node_ip4_lookup.c.o 00:01:59.999 [451/719] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_outb.c.o 00:01:59.999 [452/719] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_hash.c.o 00:01:59.999 [453/719] Compiling C object app/dpdk-dumpcap.p/dumpcap_main.c.o 00:01:59.999 [454/719] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_main.c.o 00:01:59.999 [455/719] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:01:59.999 [456/719] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_ops.c.o 00:01:59.999 [457/719] Compiling C object app/dpdk-graph.p/graph_ethdev.c.o 00:01:59.999 [458/719] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_main.c.o 00:01:59.999 [459/719] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_inb.c.o 00:01:59.999 [460/719] Linking static target drivers/libtmp_rte_mempool_ring.a 00:01:59.999 [461/719] Compiling C object lib/librte_member.a.p/member_rte_member_sketch.c.o 00:01:59.999 [462/719] Linking static target lib/librte_ipsec.a 00:01:59.999 [463/719] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ipsec.c.o 00:01:59.999 [464/719] Linking static target lib/librte_member.a 00:01:59.999 [465/719] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_options_parsing.c.o 00:01:59.999 [466/719] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_throughput.c.o 00:01:59.999 [467/719] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_options.c.o 00:01:59.999 [468/719] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_main.c.o 00:01:59.999 [469/719] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_flow_gen.c.o 00:01:59.999 [470/719] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_nvm.c.o 00:01:59.999 [471/719] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_throughput.c.o 00:01:59.999 [472/719] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_common.c.o 00:02:00.261 [473/719] Compiling C object 
app/dpdk-test-flow-perf.p/test-flow-perf_items_gen.c.o 00:02:00.261 [474/719] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_verify.c.o 00:02:00.261 [475/719] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:00.261 [476/719] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_pf.c.o 00:02:00.261 [477/719] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_model_common.c.o 00:02:00.261 [478/719] Compiling C object app/dpdk-test-gpudev.p/test-gpudev_main.c.o 00:02:00.261 [479/719] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_options.c.o 00:02:00.261 [480/719] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_device_ops.c.o 00:02:00.261 [481/719] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_pmd_cyclecount.c.o 00:02:00.261 [482/719] Compiling C object lib/librte_node.a.p/node_ip4_rewrite.c.o 00:02:00.261 [483/719] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_ordered.c.o 00:02:00.261 [484/719] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_stats.c.o 00:02:00.261 [485/719] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_interleave.c.o 00:02:00.261 [486/719] Generating lib/sched.sym_chk with a custom command (wrapped by meson to capture output) 00:02:00.261 [487/719] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_model_ops.c.o 00:02:00.261 [488/719] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_process.c.o 00:02:00.261 [489/719] Linking static target lib/librte_pdcp.a 00:02:00.261 [490/719] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_verify.c.o 00:02:00.261 [491/719] Compiling C object lib/librte_node.a.p/node_ip6_rewrite.c.o 00:02:00.261 [492/719] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:02:00.261 [493/719] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_atq.c.o 00:02:00.261 [494/719] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_queue.c.o 00:02:00.261 [495/719] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_common.c.o 00:02:00.261 [496/719] Generating lib/graph.sym_chk with a custom command (wrapped by meson to capture output) 00:02:00.261 [497/719] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_actions_gen.c.o 00:02:00.261 [498/719] Linking static target lib/librte_node.a 00:02:00.261 [499/719] Compiling C object drivers/librte_mempool_ring.so.24.2.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:00.261 [500/719] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:00.261 [501/719] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ctl.c.o 00:02:00.519 [502/719] Linking static target drivers/librte_mempool_ring.a 00:02:00.519 [503/719] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_latency.c.o 00:02:00.519 [504/719] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_config.c.o 00:02:00.519 [505/719] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_common.c.o 00:02:00.519 [506/719] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_stub.c.o 00:02:00.519 [507/719] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:02:00.519 [508/719] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev.c.o 00:02:00.519 [509/719] Linking static target lib/librte_hash.a 00:02:00.519 [510/719] Compiling C 
object app/dpdk-test-pipeline.p/test-pipeline_main.c.o 00:02:00.519 [511/719] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_vector.c.o 00:02:00.519 [512/719] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_init.c.o 00:02:00.519 [513/719] Generating lib/mldev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:00.519 [514/719] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_acl.c.o 00:02:00.520 [515/719] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm.c.o 00:02:00.520 [516/719] Generating lib/member.sym_chk with a custom command (wrapped by meson to capture output) 00:02:00.520 [517/719] Generating lib/ipsec.sym_chk with a custom command (wrapped by meson to capture output) 00:02:00.520 [518/719] Compiling C object lib/librte_port.a.p/port_rte_port_ring.c.o 00:02:00.520 [519/719] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm_ipv6.c.o 00:02:00.520 [520/719] Linking static target lib/librte_port.a 00:02:00.520 [521/719] Compiling C object app/dpdk-pdump.p/pdump_main.c.o 00:02:00.520 [522/719] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_cyclecount.c.o 00:02:00.520 [523/719] Compiling C object lib/acl/libavx2_tmp.a.p/acl_run_avx2.c.o 00:02:00.520 [524/719] Linking static target lib/acl/libavx2_tmp.a 00:02:00.520 [525/719] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:00.520 [526/719] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_cman.c.o 00:02:00.520 [527/719] Compiling C object app/dpdk-test-dma-perf.p/test-dma-perf_benchmark.c.o 00:02:00.520 [528/719] Compiling C object app/dpdk-proc-info.p/proc-info_main.c.o 00:02:00.520 [529/719] Compiling C object app/dpdk-testpmd.p/test-pmd_cmd_flex_item.c.o 00:02:00.520 [530/719] Generating lib/table.sym_chk with a custom command (wrapped by meson to capture output) 00:02:00.520 [531/719] Compiling C object app/dpdk-testpmd.p/test-pmd_rxonly.c.o 00:02:00.520 [532/719] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_flow.c.o 00:02:00.520 [533/719] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_fdir.c.o 00:02:00.520 [534/719] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_rx_adapter.c.o 00:02:00.520 [535/719] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_hash.c.o 00:02:00.520 [536/719] Linking static target lib/librte_eventdev.a 00:02:00.779 [537/719] Compiling C object app/dpdk-testpmd.p/test-pmd_shared_rxq_fwd.c.o 00:02:00.779 [538/719] Compiling C object app/dpdk-testpmd.p/test-pmd_macfwd.c.o 00:02:00.779 [539/719] Compiling C object app/dpdk-testpmd.p/test-pmd_bpf_cmd.c.o 00:02:00.779 [540/719] Compiling C object app/dpdk-testpmd.p/test-pmd_5tswap.c.o 00:02:00.779 [541/719] Compiling C object app/dpdk-testpmd.p/test-pmd_iofwd.c.o 00:02:00.779 [542/719] Compiling C object app/dpdk-testpmd.p/test-pmd_macswap.c.o 00:02:00.779 [543/719] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_common.c.o 00:02:00.779 [544/719] Generating lib/pdcp.sym_chk with a custom command (wrapped by meson to capture output) 00:02:00.779 [545/719] Compiling C object app/dpdk-testpmd.p/test-pmd_recycle_mbufs.c.o 00:02:00.779 [546/719] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_runtime.c.o 00:02:00.779 [547/719] Compiling C object app/dpdk-test-sad.p/test-sad_main.c.o 00:02:00.779 [548/719] Generating lib/node.sym_chk with a custom command (wrapped by 
meson to capture output) 00:02:00.779 [549/719] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_atq.c.o 00:02:00.779 [550/719] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_tm.c.o 00:02:00.779 [551/719] Compiling C object app/dpdk-testpmd.p/test-pmd_flowgen.c.o 00:02:00.779 [552/719] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx512.c.o 00:02:00.779 [553/719] Compiling C object app/dpdk-testpmd.p/test-pmd_ieee1588fwd.c.o 00:02:00.779 [554/719] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_queue.c.o 00:02:00.779 [555/719] Linking static target lib/librte_acl.a 00:02:00.779 [556/719] Compiling C object app/dpdk-test-fib.p/test-fib_main.c.o 00:02:00.779 [557/719] Compiling C object app/dpdk-test-security-perf.p/test-security-perf_test_security_perf.c.o 00:02:00.779 [558/719] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_sse.c.o 00:02:00.779 [559/719] Compiling C object app/dpdk-test-security-perf.p/test_test_security_proto.c.o 00:02:00.779 [560/719] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_mtr.c.o 00:02:01.037 [561/719] Compiling C object app/dpdk-testpmd.p/test-pmd_icmpecho.c.o 00:02:01.037 [562/719] Compiling C object app/dpdk-test-regex.p/test-regex_main.c.o 00:02:01.037 [563/719] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_rte_pmd_i40e.c.o 00:02:01.037 [564/719] Compiling C object app/dpdk-testpmd.p/test-pmd_util.c.o 00:02:01.037 [565/719] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline_spec.c.o 00:02:01.037 [566/719] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_main.c.o 00:02:01.037 [567/719] Compiling C object app/dpdk-testpmd.p/test-pmd_parameters.c.o 00:02:01.037 [568/719] Compiling C object drivers/net/i40e/libi40e_avx2_lib.a.p/i40e_rxtx_vec_avx2.c.o 00:02:01.037 [569/719] Linking static target drivers/net/i40e/libi40e_avx2_lib.a 00:02:01.037 [570/719] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_common.c.o 00:02:01.037 [571/719] Linking static target drivers/net/i40e/base/libi40e_base.a 00:02:01.037 [572/719] Compiling C object app/dpdk-testpmd.p/.._drivers_net_i40e_i40e_testpmd.c.o 00:02:01.295 [573/719] Compiling C object app/dpdk-test-security-perf.p/test_test_cryptodev_security_ipsec.c.o 00:02:01.295 [574/719] Compiling C object drivers/net/i40e/libi40e_avx512_lib.a.p/i40e_rxtx_vec_avx512.c.o 00:02:01.295 [575/719] Linking static target drivers/net/i40e/libi40e_avx512_lib.a 00:02:01.295 [576/719] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_queue.c.o 00:02:01.295 [577/719] Generating lib/port.sym_chk with a custom command (wrapped by meson to capture output) 00:02:01.295 [578/719] Generating lib/acl.sym_chk with a custom command (wrapped by meson to capture output) 00:02:01.295 [579/719] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_atq.c.o 00:02:01.295 [580/719] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:02:01.553 [581/719] Compiling C object app/dpdk-testpmd.p/test-pmd_csumonly.c.o 00:02:01.553 [582/719] Compiling C object app/dpdk-testpmd.p/test-pmd_txonly.c.o 00:02:01.553 [583/719] Compiling C object app/dpdk-testpmd.p/test-pmd_testpmd.c.o 00:02:01.553 [584/719] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:01.812 [585/719] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_common.c.o 00:02:01.812 [586/719] Compiling C object 
lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:02:01.812 [587/719] Linking static target lib/librte_ethdev.a 00:02:02.070 [588/719] Compiling C object app/dpdk-testpmd.p/test-pmd_noisy_vnf.c.o 00:02:02.070 [589/719] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline.c.o 00:02:02.330 [590/719] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx.c.o 00:02:02.589 [591/719] Compiling C object app/dpdk-testpmd.p/test-pmd_config.c.o 00:02:02.589 [592/719] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_common.c.o 00:02:03.157 [593/719] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_flow.c.o 00:02:03.157 [594/719] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_ethdev.c.o 00:02:03.157 [595/719] Linking static target drivers/libtmp_rte_net_i40e.a 00:02:03.416 [596/719] Generating drivers/rte_net_i40e.pmd.c with a custom command 00:02:03.416 [597/719] Compiling C object drivers/librte_net_i40e.so.24.2.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:02:03.416 [598/719] Compiling C object drivers/librte_net_i40e.a.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:02:03.674 [599/719] Linking static target drivers/librte_net_i40e.a 00:02:03.933 [600/719] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:02:04.191 [601/719] Generating lib/eventdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:04.759 [602/719] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline.c.o 00:02:04.759 [603/719] Generating drivers/rte_net_i40e.sym_chk with a custom command (wrapped by meson to capture output) 00:02:04.759 [604/719] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_perf.c.o 00:02:10.036 [605/719] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:02:10.036 [606/719] Linking target lib/librte_eal.so.24.2 00:02:10.036 [607/719] Generating symbol file lib/librte_eal.so.24.2.p/librte_eal.so.24.2.symbols 00:02:10.036 [608/719] Linking target lib/librte_timer.so.24.2 00:02:10.036 [609/719] Linking target lib/librte_dmadev.so.24.2 00:02:10.036 [610/719] Linking target lib/librte_cfgfile.so.24.2 00:02:10.036 [611/719] Linking target lib/librte_rawdev.so.24.2 00:02:10.036 [612/719] Linking target lib/librte_meter.so.24.2 00:02:10.036 [613/719] Linking target lib/librte_ring.so.24.2 00:02:10.036 [614/719] Linking target drivers/librte_bus_vdev.so.24.2 00:02:10.036 [615/719] Linking target lib/librte_pci.so.24.2 00:02:10.036 [616/719] Linking target lib/librte_jobstats.so.24.2 00:02:10.036 [617/719] Linking target lib/librte_stack.so.24.2 00:02:10.036 [618/719] Linking target lib/librte_acl.so.24.2 00:02:10.036 [619/719] Generating symbol file lib/librte_timer.so.24.2.p/librte_timer.so.24.2.symbols 00:02:10.036 [620/719] Generating symbol file lib/librte_meter.so.24.2.p/librte_meter.so.24.2.symbols 00:02:10.036 [621/719] Generating symbol file drivers/librte_bus_vdev.so.24.2.p/librte_bus_vdev.so.24.2.symbols 00:02:10.036 [622/719] Generating symbol file lib/librte_dmadev.so.24.2.p/librte_dmadev.so.24.2.symbols 00:02:10.036 [623/719] Generating symbol file lib/librte_ring.so.24.2.p/librte_ring.so.24.2.symbols 00:02:10.036 [624/719] Generating symbol file lib/librte_pci.so.24.2.p/librte_pci.so.24.2.symbols 00:02:10.036 [625/719] Generating symbol file lib/librte_acl.so.24.2.p/librte_acl.so.24.2.symbols 00:02:10.036 [626/719] Linking target drivers/librte_bus_pci.so.24.2 00:02:10.036 [627/719] Linking target lib/librte_rcu.so.24.2 00:02:10.036 [628/719] 
Linking target lib/librte_mempool.so.24.2 00:02:10.036 [629/719] Generating symbol file lib/librte_mempool.so.24.2.p/librte_mempool.so.24.2.symbols 00:02:10.036 [630/719] Generating symbol file drivers/librte_bus_pci.so.24.2.p/librte_bus_pci.so.24.2.symbols 00:02:10.036 [631/719] Generating symbol file lib/librte_rcu.so.24.2.p/librte_rcu.so.24.2.symbols 00:02:10.036 [632/719] Linking target drivers/librte_mempool_ring.so.24.2 00:02:10.036 [633/719] Linking target lib/librte_rib.so.24.2 00:02:10.036 [634/719] Linking target lib/librte_mbuf.so.24.2 00:02:10.036 [635/719] Generating symbol file lib/librte_rib.so.24.2.p/librte_rib.so.24.2.symbols 00:02:10.036 [636/719] Generating symbol file lib/librte_mbuf.so.24.2.p/librte_mbuf.so.24.2.symbols 00:02:10.295 [637/719] Linking target lib/librte_fib.so.24.2 00:02:10.295 [638/719] Linking target lib/librte_distributor.so.24.2 00:02:10.295 [639/719] Linking target lib/librte_reorder.so.24.2 00:02:10.295 [640/719] Linking target lib/librte_net.so.24.2 00:02:10.295 [641/719] Linking target lib/librte_bbdev.so.24.2 00:02:10.295 [642/719] Linking target lib/librte_compressdev.so.24.2 00:02:10.295 [643/719] Linking target lib/librte_gpudev.so.24.2 00:02:10.295 [644/719] Linking target lib/librte_regexdev.so.24.2 00:02:10.295 [645/719] Linking target lib/librte_cryptodev.so.24.2 00:02:10.295 [646/719] Linking target lib/librte_sched.so.24.2 00:02:10.295 [647/719] Linking target lib/librte_mldev.so.24.2 00:02:10.295 [648/719] Generating symbol file lib/librte_reorder.so.24.2.p/librte_reorder.so.24.2.symbols 00:02:10.295 [649/719] Generating symbol file lib/librte_cryptodev.so.24.2.p/librte_cryptodev.so.24.2.symbols 00:02:10.295 [650/719] Generating symbol file lib/librte_net.so.24.2.p/librte_net.so.24.2.symbols 00:02:10.295 [651/719] Generating symbol file lib/librte_sched.so.24.2.p/librte_sched.so.24.2.symbols 00:02:10.295 [652/719] Linking target lib/librte_hash.so.24.2 00:02:10.295 [653/719] Linking target lib/librte_security.so.24.2 00:02:10.295 [654/719] Linking target lib/librte_cmdline.so.24.2 00:02:10.554 [655/719] Generating symbol file lib/librte_hash.so.24.2.p/librte_hash.so.24.2.symbols 00:02:10.554 [656/719] Generating symbol file lib/librte_security.so.24.2.p/librte_security.so.24.2.symbols 00:02:10.554 [657/719] Linking target lib/librte_lpm.so.24.2 00:02:10.554 [658/719] Linking target lib/librte_efd.so.24.2 00:02:10.554 [659/719] Linking target lib/librte_member.so.24.2 00:02:10.554 [660/719] Linking target lib/librte_pdcp.so.24.2 00:02:10.554 [661/719] Linking target lib/librte_ipsec.so.24.2 00:02:10.554 [662/719] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:10.813 [663/719] Linking target lib/librte_ethdev.so.24.2 00:02:10.813 [664/719] Generating symbol file lib/librte_lpm.so.24.2.p/librte_lpm.so.24.2.symbols 00:02:10.813 [665/719] Generating symbol file lib/librte_ipsec.so.24.2.p/librte_ipsec.so.24.2.symbols 00:02:10.813 [666/719] Generating symbol file lib/librte_ethdev.so.24.2.p/librte_ethdev.so.24.2.symbols 00:02:10.813 [667/719] Linking target lib/librte_metrics.so.24.2 00:02:10.813 [668/719] Linking target lib/librte_gso.so.24.2 00:02:10.813 [669/719] Linking target lib/librte_ip_frag.so.24.2 00:02:10.813 [670/719] Linking target lib/librte_gro.so.24.2 00:02:10.813 [671/719] Linking target lib/librte_pcapng.so.24.2 00:02:10.813 [672/719] Linking target lib/librte_bpf.so.24.2 00:02:10.813 [673/719] Linking target lib/librte_power.so.24.2 00:02:10.813 [674/719] Linking target 
lib/librte_eventdev.so.24.2 00:02:10.813 [675/719] Linking target drivers/librte_net_i40e.so.24.2 00:02:11.071 [676/719] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_table_action.c.o 00:02:11.071 [677/719] Linking static target lib/librte_pipeline.a 00:02:11.071 [678/719] Generating symbol file lib/librte_ip_frag.so.24.2.p/librte_ip_frag.so.24.2.symbols 00:02:11.071 [679/719] Generating symbol file lib/librte_bpf.so.24.2.p/librte_bpf.so.24.2.symbols 00:02:11.071 [680/719] Generating symbol file lib/librte_metrics.so.24.2.p/librte_metrics.so.24.2.symbols 00:02:11.071 [681/719] Generating symbol file lib/librte_eventdev.so.24.2.p/librte_eventdev.so.24.2.symbols 00:02:11.071 [682/719] Generating symbol file lib/librte_pcapng.so.24.2.p/librte_pcapng.so.24.2.symbols 00:02:11.071 [683/719] Linking target lib/librte_latencystats.so.24.2 00:02:11.071 [684/719] Linking target lib/librte_bitratestats.so.24.2 00:02:11.071 [685/719] Linking target lib/librte_dispatcher.so.24.2 00:02:11.071 [686/719] Linking target lib/librte_port.so.24.2 00:02:11.071 [687/719] Linking target lib/librte_pdump.so.24.2 00:02:11.071 [688/719] Linking target lib/librte_graph.so.24.2 00:02:11.329 [689/719] Generating symbol file lib/librte_port.so.24.2.p/librte_port.so.24.2.symbols 00:02:11.329 [690/719] Generating symbol file lib/librte_graph.so.24.2.p/librte_graph.so.24.2.symbols 00:02:11.329 [691/719] Linking target lib/librte_table.so.24.2 00:02:11.329 [692/719] Linking target lib/librte_node.so.24.2 00:02:11.329 [693/719] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:02:11.329 [694/719] Linking static target lib/librte_vhost.a 00:02:11.329 [695/719] Generating symbol file lib/librte_table.so.24.2.p/librte_table.so.24.2.symbols 00:02:11.895 [696/719] Linking target app/dpdk-dumpcap 00:02:11.895 [697/719] Linking target app/dpdk-proc-info 00:02:11.895 [698/719] Linking target app/dpdk-test-gpudev 00:02:11.895 [699/719] Linking target app/dpdk-test-sad 00:02:11.895 [700/719] Linking target app/dpdk-test-bbdev 00:02:11.895 [701/719] Linking target app/dpdk-pdump 00:02:11.895 [702/719] Linking target app/dpdk-test-acl 00:02:11.895 [703/719] Linking target app/dpdk-graph 00:02:11.895 [704/719] Linking target app/dpdk-test-crypto-perf 00:02:11.895 [705/719] Linking target app/dpdk-test-flow-perf 00:02:11.895 [706/719] Linking target app/dpdk-test-compress-perf 00:02:11.895 [707/719] Linking target app/dpdk-test-eventdev 00:02:11.895 [708/719] Linking target app/dpdk-test-cmdline 00:02:11.895 [709/719] Linking target app/dpdk-test-fib 00:02:11.895 [710/719] Linking target app/dpdk-test-dma-perf 00:02:11.895 [711/719] Linking target app/dpdk-test-mldev 00:02:11.895 [712/719] Linking target app/dpdk-test-security-perf 00:02:11.895 [713/719] Linking target app/dpdk-test-regex 00:02:11.895 [714/719] Linking target app/dpdk-test-pipeline 00:02:11.895 [715/719] Linking target app/dpdk-testpmd 00:02:13.800 [716/719] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:02:13.800 [717/719] Linking target lib/librte_vhost.so.24.2 00:02:16.335 [718/719] Generating lib/pipeline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.593 [719/719] Linking target lib/librte_pipeline.so.24.2 00:02:16.593 20:49:32 -- common/autobuild_common.sh@187 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112 install 00:02:16.593 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp' 00:02:16.593 [0/1] 
Installing files. 00:02:16.860 Installing subdir /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples 00:02:16.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:16.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:16.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:16.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:16.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:16.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:16.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:16.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:16.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:16.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:16.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:16.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:16.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:16.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipv6_addr_swap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.860 
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipv6_addr_swap.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/packet.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_routing_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.860 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ethdev.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/rss.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec_sa.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.861 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_group_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/pcap.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.cli to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/rss.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:16.861 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/vdpa_blk_compact.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v6.cfg to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_fib.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.861 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.862 
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_route.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_sequential.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/l2fwd-cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/basicfwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:02:16.862 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk_spec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk_compat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_node/node.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_node 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_node/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_node 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/args.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/shared 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_rsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_gcm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_sha.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_aes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_tdes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/main.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_cmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_hmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ccm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_xts.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:16.862 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ecdsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/flow_blocks.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:02:16.863 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/README to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t3.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/dummy.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t1.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t2.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/ptpclient.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt 00:02:16.863 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/dmafwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/ntb_fwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/commands.list to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/client.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:02:16.863 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/shared 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:16.863 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/pkt_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/altivec/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/altivec 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/sse/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/sse 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/neon/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/neon 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_tx.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/stats.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_pie.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_ov.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cmdline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 
00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_red.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/app_thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/virtio_net.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_nop.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/main.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_x86.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/main.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.864 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.865 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/tap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/firewall.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/rss.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow_crypto.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route_ecmp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:16.865 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep1.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/rt.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_process.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp4.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sa.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.865 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.866 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep0.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.866 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.866 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.866 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp6.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.866 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:16.866 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.866 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.866 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.866 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.866 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/bypass_defs.sh to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.866 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_null_header_reconstruct.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.866 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.866 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.866 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.866 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_ipv6opts.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.866 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.866 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.866 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.866 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.866 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/data_rxtx.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.866 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/linux_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.866 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.866 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.866 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.866 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.866 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.866 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.866 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.866 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/load_env.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.866 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs_secgw.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.866 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.866 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/run_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:16.866 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app 00:02:16.866 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app 00:02:16.866 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-macsec/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-macsec 00:02:16.866 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-macsec/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-macsec 00:02:16.866 Installing lib/librte_log.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_log.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_kvargs.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_kvargs.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_argparse.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_argparse.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_telemetry.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_telemetry.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_eal.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_eal.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 
00:02:16.866 Installing lib/librte_ring.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_rcu.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_rcu.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_mempool.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_mempool.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_mbuf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_mbuf.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_net.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_net.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_meter.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_meter.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_ethdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_ethdev.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_pci.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_cmdline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_cmdline.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_metrics.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_metrics.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_hash.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_hash.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_timer.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_timer.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_acl.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_acl.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_bbdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_bbdev.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_bitratestats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_bitratestats.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_bpf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_bpf.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_cfgfile.a to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_cfgfile.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_compressdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_compressdev.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_cryptodev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_cryptodev.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_distributor.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_distributor.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_dmadev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_dmadev.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.866 Installing lib/librte_efd.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.867 Installing lib/librte_efd.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.867 Installing lib/librte_eventdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.867 Installing lib/librte_eventdev.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.867 Installing lib/librte_dispatcher.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.867 Installing lib/librte_dispatcher.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.867 Installing lib/librte_gpudev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.867 Installing lib/librte_gpudev.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.867 Installing lib/librte_gro.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.867 Installing lib/librte_gro.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.867 Installing lib/librte_gso.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.867 Installing lib/librte_gso.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.867 Installing lib/librte_ip_frag.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.867 Installing lib/librte_ip_frag.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.867 Installing lib/librte_jobstats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.867 Installing lib/librte_jobstats.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:16.867 Installing lib/librte_latencystats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing lib/librte_latencystats.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing lib/librte_lpm.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing lib/librte_lpm.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing lib/librte_member.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing lib/librte_member.so.24.2 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing lib/librte_pcapng.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing lib/librte_pcapng.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing lib/librte_power.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing lib/librte_power.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing lib/librte_rawdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing lib/librte_rawdev.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing lib/librte_regexdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing lib/librte_regexdev.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing lib/librte_mldev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing lib/librte_mldev.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing lib/librte_rib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing lib/librte_rib.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing lib/librte_reorder.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing lib/librte_reorder.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing lib/librte_sched.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing lib/librte_sched.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing lib/librte_security.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing lib/librte_security.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing lib/librte_stack.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing lib/librte_stack.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing lib/librte_vhost.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing lib/librte_vhost.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing lib/librte_ipsec.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing lib/librte_ipsec.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing lib/librte_pdcp.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing lib/librte_pdcp.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing lib/librte_fib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing lib/librte_fib.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing lib/librte_port.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing lib/librte_port.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing lib/librte_pdump.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing 
lib/librte_pdump.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing lib/librte_table.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing lib/librte_table.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing lib/librte_pipeline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing lib/librte_pipeline.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing lib/librte_graph.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing lib/librte_graph.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing lib/librte_node.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing lib/librte_node.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing drivers/librte_bus_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing drivers/librte_bus_pci.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.2 00:02:17.131 Installing drivers/librte_bus_vdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing drivers/librte_bus_vdev.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.2 00:02:17.131 Installing drivers/librte_mempool_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing drivers/librte_mempool_ring.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.2 00:02:17.131 Installing drivers/librte_net_i40e.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:17.131 Installing drivers/librte_net_i40e.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.2 00:02:17.131 Installing app/dpdk-dumpcap to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:17.131 Installing app/dpdk-graph to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:17.131 Installing app/dpdk-pdump to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:17.131 Installing app/dpdk-proc-info to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:17.131 Installing app/dpdk-test-acl to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:17.131 Installing app/dpdk-test-bbdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:17.131 Installing app/dpdk-test-cmdline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:17.131 Installing app/dpdk-test-compress-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:17.131 Installing app/dpdk-test-crypto-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:17.131 Installing app/dpdk-test-dma-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:17.131 Installing app/dpdk-test-eventdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:17.131 Installing app/dpdk-test-fib to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:17.131 Installing app/dpdk-test-flow-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:17.131 Installing app/dpdk-test-gpudev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:17.131 Installing 
app/dpdk-test-mldev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:17.131 Installing app/dpdk-test-pipeline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:17.131 Installing app/dpdk-testpmd to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:17.131 Installing app/dpdk-test-regex to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:17.131 Installing app/dpdk-test-sad to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:17.131 Installing app/dpdk-test-security-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:17.131 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/rte_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.131 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/log/rte_log.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.131 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/kvargs/rte_kvargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.131 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/argparse/rte_argparse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.131 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/telemetry/rte_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.131 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:17.131 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:17.131 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:17.131 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:17.131 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:17.131 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:17.131 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:17.131 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:17.131 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:17.131 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:17.131 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_spinlock.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:17.131 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:17.131 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.131 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.131 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.131 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.131 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.131 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.131 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.131 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.131 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.131 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rtm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.131 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.131 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.131 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.131 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.131 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.131 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.131 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.131 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_alarm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.131 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitmap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.131 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.131 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_branch_prediction.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.131 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bus.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.131 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_class.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.131 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.131 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_compat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.131 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_debug.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.131 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_dev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.131 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_devargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.131 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.131 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_memconfig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_errno.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_epoll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_fbarray.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hexdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hypervisor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_interrupts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_keepalive.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_launch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_lock_annotations.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_malloc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_mcslock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memory.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memzone.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_feature_defs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_features.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_per_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pflock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_random.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_reciprocal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqcount.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service_component.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_stdatomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_string_fns.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_tailq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_ticketlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_time.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point_register.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_uuid.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_version.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_vfio.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/linux/include/rte_os.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_c11_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_generic_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_zc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rcu/rte_rcu_qsbr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_ptype.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_pool_ops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_dyn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_tcp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_udp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_tls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_dtls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_sctp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_icmp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_arp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ether.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_macsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_vxlan.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gre.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gtp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_mpls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_higig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ecpri.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_pdcp_hdr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_geneve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_l2tpv2.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ppp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/meter/rte_meter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_cman.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_dev_info.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_eth_ctrl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pci/rte_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_num.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_ipaddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_etheraddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_string.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_rdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_vt100.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_socket.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_cirbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_portlist.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_fbk_hash.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_jhash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_sw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_x86_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/timer/rte_timer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl_osdep.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_op.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bitratestats/rte_bitrate.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/bpf_def.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cfgfile/rte_cfgfile.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_compressdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_comp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_sym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_asym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/distributor/rte_distributor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/efd/rte_efd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_crypto_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_dma_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_rx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_tx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_timer_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.132 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dispatcher/rte_dispatcher.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gpudev/rte_gpudev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gro/rte_gro.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gso/rte_gso.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ip_frag/rte_ip_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/jobstats/rte_jobstats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/latencystats/rte_latencystats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/member/rte_member.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pcapng/rte_pcapng.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_guest_channel.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_pmd_mgmt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_uncore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mldev/rte_mldev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mldev/rte_mldev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/reorder/rte_reorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_approx.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_red.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_pie.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_std.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_c11.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_stubs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vdpa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_async.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdcp/rte_pdcp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdcp/rte_pdcp_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ras.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sym_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdump/rte_pdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_learner.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_selector.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_wm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_acl.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_array.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_cuckoo.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm_ipv6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_stub.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_port_in_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_table_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_extern.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_ctl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_model_mcore_dispatch.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_model_rtc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_worker_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_eth_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_ip4_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_ip6_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_udp4_input_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/pci/rte_bus_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/vdev/rte_bus_vdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/net/i40e/rte_pmd_i40e.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/dpdk-cmdline-gen.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-devbind.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-pmdinfo.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-telemetry.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-hugepages.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-rss-flows.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/rte_build_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk-libs.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:02:17.133 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:02:17.133 Installing symlink pointing to librte_log.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_log.so.24 00:02:17.133 Installing symlink pointing to librte_log.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_log.so 00:02:17.133 Installing symlink pointing to 
librte_kvargs.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so.24 00:02:17.133 Installing symlink pointing to librte_kvargs.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so 00:02:17.133 Installing symlink pointing to librte_argparse.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_argparse.so.24 00:02:17.133 Installing symlink pointing to librte_argparse.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_argparse.so 00:02:17.133 Installing symlink pointing to librte_telemetry.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so.24 00:02:17.133 Installing symlink pointing to librte_telemetry.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so 00:02:17.133 Installing symlink pointing to librte_eal.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so.24 00:02:17.133 Installing symlink pointing to librte_eal.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so 00:02:17.133 Installing symlink pointing to librte_ring.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so.24 00:02:17.133 Installing symlink pointing to librte_ring.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so 00:02:17.133 Installing symlink pointing to librte_rcu.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so.24 00:02:17.133 Installing symlink pointing to librte_rcu.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so 00:02:17.133 Installing symlink pointing to librte_mempool.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so.24 00:02:17.133 Installing symlink pointing to librte_mempool.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so 00:02:17.133 Installing symlink pointing to librte_mbuf.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so.24 00:02:17.133 Installing symlink pointing to librte_mbuf.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so 00:02:17.133 Installing symlink pointing to librte_net.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so.24 00:02:17.133 Installing symlink pointing to librte_net.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so 00:02:17.133 Installing symlink pointing to librte_meter.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so.24 00:02:17.133 Installing symlink pointing to librte_meter.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so 00:02:17.133 Installing symlink pointing to librte_ethdev.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so.24 00:02:17.133 Installing symlink pointing to librte_ethdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so 00:02:17.134 Installing symlink pointing to librte_pci.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so.24 00:02:17.134 Installing symlink pointing to librte_pci.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so 00:02:17.134 Installing symlink pointing to librte_cmdline.so.24.2 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so.24 00:02:17.134 Installing symlink pointing to librte_cmdline.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so 00:02:17.134 Installing symlink pointing to librte_metrics.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so.24 00:02:17.134 Installing symlink pointing to librte_metrics.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so 00:02:17.134 Installing symlink pointing to librte_hash.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so.24 00:02:17.134 Installing symlink pointing to librte_hash.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so 00:02:17.134 Installing symlink pointing to librte_timer.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so.24 00:02:17.134 Installing symlink pointing to librte_timer.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so 00:02:17.134 Installing symlink pointing to librte_acl.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so.24 00:02:17.134 Installing symlink pointing to librte_acl.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so 00:02:17.134 Installing symlink pointing to librte_bbdev.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so.24 00:02:17.134 Installing symlink pointing to librte_bbdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so 00:02:17.134 Installing symlink pointing to librte_bitratestats.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so.24 00:02:17.134 Installing symlink pointing to librte_bitratestats.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so 00:02:17.134 Installing symlink pointing to librte_bpf.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so.24 00:02:17.134 Installing symlink pointing to librte_bpf.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so 00:02:17.134 Installing symlink pointing to librte_cfgfile.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so.24 00:02:17.134 Installing symlink pointing to librte_cfgfile.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so 00:02:17.134 Installing symlink pointing to librte_compressdev.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so.24 00:02:17.134 Installing symlink pointing to librte_compressdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so 00:02:17.134 Installing symlink pointing to librte_cryptodev.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so.24 00:02:17.134 Installing symlink pointing to librte_cryptodev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so 00:02:17.134 Installing symlink pointing to librte_distributor.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so.24 00:02:17.134 Installing symlink pointing to librte_distributor.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so 00:02:17.134 
Installing symlink pointing to librte_dmadev.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so.24 00:02:17.134 Installing symlink pointing to librte_dmadev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so 00:02:17.134 Installing symlink pointing to librte_efd.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so.24 00:02:17.134 Installing symlink pointing to librte_efd.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so 00:02:17.134 Installing symlink pointing to librte_eventdev.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so.24 00:02:17.134 Installing symlink pointing to librte_eventdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so 00:02:17.134 Installing symlink pointing to librte_dispatcher.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dispatcher.so.24 00:02:17.134 Installing symlink pointing to librte_dispatcher.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dispatcher.so 00:02:17.134 Installing symlink pointing to librte_gpudev.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so.24 00:02:17.134 Installing symlink pointing to librte_gpudev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so 00:02:17.134 Installing symlink pointing to librte_gro.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so.24 00:02:17.134 Installing symlink pointing to librte_gro.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so 00:02:17.134 Installing symlink pointing to librte_gso.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so.24 00:02:17.134 Installing symlink pointing to librte_gso.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so 00:02:17.134 Installing symlink pointing to librte_ip_frag.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so.24 00:02:17.134 Installing symlink pointing to librte_ip_frag.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so 00:02:17.134 Installing symlink pointing to librte_jobstats.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so.24 00:02:17.134 Installing symlink pointing to librte_jobstats.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so 00:02:17.134 Installing symlink pointing to librte_latencystats.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so.24 00:02:17.134 Installing symlink pointing to librte_latencystats.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so 00:02:17.134 Installing symlink pointing to librte_lpm.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so.24 00:02:17.134 Installing symlink pointing to librte_lpm.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so 00:02:17.134 Installing symlink pointing to librte_member.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so.24 00:02:17.134 Installing symlink pointing to librte_member.so.24 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so 00:02:17.134 Installing symlink pointing to librte_pcapng.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so.24 00:02:17.134 Installing symlink pointing to librte_pcapng.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so 00:02:17.134 Installing symlink pointing to librte_power.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so.24 00:02:17.134 Installing symlink pointing to librte_power.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so 00:02:17.134 Installing symlink pointing to librte_rawdev.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so.24 00:02:17.134 Installing symlink pointing to librte_rawdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so 00:02:17.134 Installing symlink pointing to librte_regexdev.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so.24 00:02:17.134 Installing symlink pointing to librte_regexdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so 00:02:17.134 Installing symlink pointing to librte_mldev.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mldev.so.24 00:02:17.134 Installing symlink pointing to librte_mldev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mldev.so 00:02:17.134 Installing symlink pointing to librte_rib.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so.24 00:02:17.134 Installing symlink pointing to librte_rib.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so 00:02:17.134 Installing symlink pointing to librte_reorder.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so.24 00:02:17.134 Installing symlink pointing to librte_reorder.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so 00:02:17.134 Installing symlink pointing to librte_sched.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so.24 00:02:17.134 Installing symlink pointing to librte_sched.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so 00:02:17.134 Installing symlink pointing to librte_security.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so.24 00:02:17.134 Installing symlink pointing to librte_security.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so 00:02:17.134 Installing symlink pointing to librte_stack.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so.24 00:02:17.134 Installing symlink pointing to librte_stack.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so 00:02:17.134 Installing symlink pointing to librte_vhost.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so.24 00:02:17.134 Installing symlink pointing to librte_vhost.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so 00:02:17.134 Installing symlink pointing to librte_ipsec.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so.24 00:02:17.134 Installing symlink pointing to librte_ipsec.so.24 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so 00:02:17.134 Installing symlink pointing to librte_pdcp.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdcp.so.24 00:02:17.134 Installing symlink pointing to librte_pdcp.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdcp.so 00:02:17.134 Installing symlink pointing to librte_fib.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so.24 00:02:17.134 Installing symlink pointing to librte_fib.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so 00:02:17.134 Installing symlink pointing to librte_port.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so.24 00:02:17.134 Installing symlink pointing to librte_port.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so 00:02:17.134 Installing symlink pointing to librte_pdump.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so.24 00:02:17.134 Installing symlink pointing to librte_pdump.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so 00:02:17.134 Installing symlink pointing to librte_table.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so.24 00:02:17.134 './librte_bus_pci.so' -> 'dpdk/pmds-24.2/librte_bus_pci.so' 00:02:17.134 './librte_bus_pci.so.24' -> 'dpdk/pmds-24.2/librte_bus_pci.so.24' 00:02:17.134 './librte_bus_pci.so.24.2' -> 'dpdk/pmds-24.2/librte_bus_pci.so.24.2' 00:02:17.134 './librte_bus_vdev.so' -> 'dpdk/pmds-24.2/librte_bus_vdev.so' 00:02:17.134 './librte_bus_vdev.so.24' -> 'dpdk/pmds-24.2/librte_bus_vdev.so.24' 00:02:17.134 './librte_bus_vdev.so.24.2' -> 'dpdk/pmds-24.2/librte_bus_vdev.so.24.2' 00:02:17.134 './librte_mempool_ring.so' -> 'dpdk/pmds-24.2/librte_mempool_ring.so' 00:02:17.134 './librte_mempool_ring.so.24' -> 'dpdk/pmds-24.2/librte_mempool_ring.so.24' 00:02:17.134 './librte_mempool_ring.so.24.2' -> 'dpdk/pmds-24.2/librte_mempool_ring.so.24.2' 00:02:17.134 './librte_net_i40e.so' -> 'dpdk/pmds-24.2/librte_net_i40e.so' 00:02:17.134 './librte_net_i40e.so.24' -> 'dpdk/pmds-24.2/librte_net_i40e.so.24' 00:02:17.134 './librte_net_i40e.so.24.2' -> 'dpdk/pmds-24.2/librte_net_i40e.so.24.2' 00:02:17.134 Installing symlink pointing to librte_table.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so 00:02:17.134 Installing symlink pointing to librte_pipeline.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so.24 00:02:17.134 Installing symlink pointing to librte_pipeline.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so 00:02:17.134 Installing symlink pointing to librte_graph.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so.24 00:02:17.134 Installing symlink pointing to librte_graph.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so 00:02:17.134 Installing symlink pointing to librte_node.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so.24 00:02:17.134 Installing symlink pointing to librte_node.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so 00:02:17.134 Installing symlink pointing to librte_bus_pci.so.24.2 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.2/librte_bus_pci.so.24
00:02:17.134 Installing symlink pointing to librte_bus_pci.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.2/librte_bus_pci.so
00:02:17.134 Installing symlink pointing to librte_bus_vdev.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.2/librte_bus_vdev.so.24
00:02:17.134 Installing symlink pointing to librte_bus_vdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.2/librte_bus_vdev.so
00:02:17.134 Installing symlink pointing to librte_mempool_ring.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.2/librte_mempool_ring.so.24
00:02:17.134 Installing symlink pointing to librte_mempool_ring.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.2/librte_mempool_ring.so
00:02:17.134 Installing symlink pointing to librte_net_i40e.so.24.2 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.2/librte_net_i40e.so.24
00:02:17.134 Installing symlink pointing to librte_net_i40e.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.2/librte_net_i40e.so
00:02:17.134 Running custom install script '/bin/sh /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/../buildtools/symlink-drivers-solibs.sh lib dpdk/pmds-24.2'
00:02:17.134 20:49:32 -- common/autobuild_common.sh@189 -- $ uname -s
00:02:17.134 20:49:32 -- common/autobuild_common.sh@189 -- $ [[ Linux == \F\r\e\e\B\S\D ]]
00:02:17.134 20:49:32 -- common/autobuild_common.sh@200 -- $ cat
00:02:17.134 20:49:32 -- common/autobuild_common.sh@205 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:02:17.134
00:02:17.134 real 0m27.577s
00:02:17.134 user 8m11.230s
00:02:17.134 sys 2m35.620s
00:02:17.134 20:49:32 -- common/autotest_common.sh@1112 -- $ xtrace_disable
00:02:17.134 20:49:32 -- common/autotest_common.sh@10 -- $ set +x
00:02:17.134 ************************************
00:02:17.134 END TEST build_native_dpdk
00:02:17.134 ************************************
00:02:17.400 20:49:32 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in
00:02:17.400 20:49:32 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]]
00:02:17.400 20:49:32 -- spdk/autobuild.sh@51 -- $ [[ 1 -eq 1 ]]
00:02:17.400 20:49:32 -- spdk/autobuild.sh@52 -- $ llvm_precompile
00:02:17.400 20:49:32 -- common/autobuild_common.sh@423 -- $ run_test autobuild_llvm_precompile _llvm_precompile
00:02:17.400 20:49:32 -- common/autotest_common.sh@1087 -- $ '[' 2 -le 1 ']'
00:02:17.400 20:49:32 -- common/autotest_common.sh@1093 -- $ xtrace_disable
00:02:17.400 20:49:32 -- common/autotest_common.sh@10 -- $ set +x
00:02:17.400 ************************************
00:02:17.400 START TEST autobuild_llvm_precompile
00:02:17.400 ************************************
00:02:17.400 20:49:32 -- common/autotest_common.sh@1111 -- $ _llvm_precompile
00:02:17.400 20:49:32 -- common/autobuild_common.sh@32 -- $ clang --version
00:02:17.400 20:49:33 -- common/autobuild_common.sh@32 -- $ [[ clang version 16.0.6 (Fedora 16.0.6-3.fc38)
00:02:17.400 Target: x86_64-redhat-linux-gnu
00:02:17.400 Thread model: posix
00:02:17.400 InstalledDir: /usr/bin =~ version (([0-9]+).([0-9]+).([0-9]+)) ]]
00:02:17.400 20:49:33 -- common/autobuild_common.sh@33 -- $ clang_num=16
00:02:17.400 20:49:33 -- common/autobuild_common.sh@35 -- $ export CC=clang-16
00:02:17.400 20:49:33 -- common/autobuild_common.sh@35 -- $ CC=clang-16
00:02:17.400 20:49:33 -- common/autobuild_common.sh@36 -- $ export CXX=clang++-16
00:02:17.400 20:49:33 -- common/autobuild_common.sh@36 -- $ CXX=clang++-16
00:02:17.400 20:49:33 -- common/autobuild_common.sh@38 -- $ fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/*linux*/libclang_rt.fuzzer_no_main?(-x86_64).a)
00:02:17.400 20:49:33 -- common/autobuild_common.sh@39 -- $ fuzzer_lib=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a
00:02:17.400 20:49:33 -- common/autobuild_common.sh@40 -- $ [[ -e /usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a ]]
00:02:17.400 20:49:33 -- common/autobuild_common.sh@42 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a'
00:02:17.400 20:49:33 -- common/autobuild_common.sh@44 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a
00:02:17.667 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs...
00:02:17.942 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:17.942 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:17.942 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk
00:02:18.515 Using 'verbs' RDMA provider
00:02:34.344 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal.log)...done.
00:02:46.671 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal-crypto.log)...done.
00:02:46.671 Creating mk/config.mk...done.
00:02:46.671 Creating mk/cc.flags.mk...done.
00:02:46.671 Type 'make' to build.
00:02:46.671
00:02:46.671 real 0m28.902s
00:02:46.671 user 0m12.389s
00:02:46.671 sys 0m15.697s
00:02:46.671 20:50:01 -- common/autotest_common.sh@1112 -- $ xtrace_disable
00:02:46.671 20:50:01 -- common/autotest_common.sh@10 -- $ set +x
00:02:46.671 ************************************
00:02:46.671 END TEST autobuild_llvm_precompile
00:02:46.671 ************************************
00:02:46.671 20:50:01 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]]
00:02:46.671 20:50:01 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]]
00:02:46.671 20:50:01 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]]
00:02:46.671 20:50:01 -- spdk/autobuild.sh@62 -- $ [[ 1 -eq 1 ]]
00:02:46.671 20:50:01 -- spdk/autobuild.sh@64 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a
00:02:46.671 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs...
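The xtrace entries from common/autobuild_common.sh@32-44 above show how the precompile step selects its toolchain: a bash regex scrapes the clang major version out of `clang --version`, an extglob pattern then resolves the matching libclang_rt.fuzzer_no_main archive, and the result is handed to configure as --with-fuzzer. A minimal standalone sketch of that detection logic, assuming clang is on PATH and the Fedora runtime layout seen in this run (the trailing echo is illustrative; the real script folds the path into config_params instead):

  #!/usr/bin/env bash
  # Sketch of the clang/libFuzzer detection replayed from the xtrace above.
  shopt -s extglob                       # required for the @(...) and ?(...) patterns
  if [[ $(clang --version) =~ version\ (([0-9]+)\.[0-9]+\.[0-9]+) ]]; then
      clang_version=${BASH_REMATCH[1]}   # e.g. 16.0.6
      clang_num=${BASH_REMATCH[2]}       # e.g. 16
  fi
  # Accept either the major version or the full version as the clang dir name,
  # with or without the -x86_64 suffix on the archive.
  fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/*linux*/libclang_rt.fuzzer_no_main?(-x86_64).a)
  fuzzer_lib=${fuzzer_libs[0]}
  [[ -e $fuzzer_lib ]] && echo "--with-fuzzer=$fuzzer_lib"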
00:02:46.671 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:46.671 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:46.931 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk
00:02:47.190 Using 'verbs' RDMA provider
00:03:00.341 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal.log)...done.
00:03:12.554 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal-crypto.log)...done.
00:03:12.554 Creating mk/config.mk...done.
00:03:12.554 Creating mk/cc.flags.mk...done.
00:03:12.554 Type 'make' to build.
00:03:12.554 20:50:26 -- spdk/autobuild.sh@69 -- $ run_test make make -j112
00:03:12.554 20:50:26 -- common/autotest_common.sh@1087 -- $ '[' 3 -le 1 ']'
00:03:12.554 20:50:26 -- common/autotest_common.sh@1093 -- $ xtrace_disable
00:03:12.554 20:50:26 -- common/autotest_common.sh@10 -- $ set +x
00:03:12.554 ************************************
00:03:12.554 START TEST make
00:03:12.554 ************************************
00:03:12.554 20:50:26 -- common/autotest_common.sh@1111 -- $ make -j112
00:03:12.554 make[1]: Nothing to be done for 'all'.
00:03:13.122 The Meson build system
00:03:13.122 Version: 1.3.1
00:03:13.122 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user
00:03:13.122 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug
00:03:13.122 Build type: native build
00:03:13.122 Project name: libvfio-user
00:03:13.122 Project version: 0.0.1
00:03:13.122 C compiler for the host machine: clang-16 (clang 16.0.6 "clang version 16.0.6 (Fedora 16.0.6-3.fc38)")
00:03:13.122 C linker for the host machine: clang-16 ld.bfd 2.39-16
00:03:13.122 Host machine cpu family: x86_64
00:03:13.122 Host machine cpu: x86_64
00:03:13.122 Run-time dependency threads found: YES
00:03:13.122 Library dl found: YES
00:03:13.122 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0
00:03:13.122 Run-time dependency json-c found: YES 0.17
00:03:13.122 Run-time dependency cmocka found: YES 1.1.7
00:03:13.122 Program pytest-3 found: NO
00:03:13.122 Program flake8 found: NO
00:03:13.122 Program misspell-fixer found: NO
00:03:13.122 Program restructuredtext-lint found: NO
00:03:13.122 Program valgrind found: YES (/usr/bin/valgrind)
00:03:13.122 Compiler for C supports arguments -Wno-missing-field-initializers: YES
00:03:13.122 Compiler for C supports arguments -Wmissing-declarations: YES
00:03:13.122 Compiler for C supports arguments -Wwrite-strings: YES
00:03:13.122 ../libvfio-user/test/meson.build:20: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup.
00:03:13.122 Program test-lspci.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-lspci.sh)
00:03:13.122 Program test-linkage.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-linkage.sh)
00:03:13.122 ../libvfio-user/test/py/meson.build:16: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup.
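make immediately recurses into the libvfio-user submodule, which is configured as a standalone Meson project: clang-16 as the C compiler, json-c and cmocka as the run-time dependencies, and the optional lint/test tooling probed but mostly absent. Reproducing that configuration by hand would look roughly like this (paths are placeholders; the option values mirror the "User defined options" summary printed just below):

  # Rough equivalent of the Meson setup/build/install cycle driven here.
  meson setup build-debug /path/to/spdk/libvfio-user \
      --buildtype=debug --default-library=static --libdir=/usr/local/lib
  ninja -C build-debug
  DESTDIR=/path/to/spdk/build/libvfio-user meson install --quiet -C build-debug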
00:03:13.122 Build targets in project: 8
00:03:13.122 WARNING: Project specifies a minimum meson_version '>= 0.53.0' but uses features which were added in newer versions:
00:03:13.122 * 0.57.0: {'exclude_suites arg in add_test_setup'}
00:03:13.122
00:03:13.122 libvfio-user 0.0.1
00:03:13.122
00:03:13.122 User defined options
00:03:13.122 buildtype : debug
00:03:13.122 default_library: static
00:03:13.122 libdir : /usr/local/lib
00:03:13.122
00:03:13.122 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja
00:03:13.381 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug'
00:03:13.640 [1/36] Compiling C object samples/null.p/null.c.o
00:03:13.640 [2/36] Compiling C object samples/lspci.p/lspci.c.o
00:03:13.640 [3/36] Compiling C object samples/shadow_ioeventfd_server.p/shadow_ioeventfd_server.c.o
00:03:13.640 [4/36] Compiling C object lib/libvfio-user.a.p/irq.c.o
00:03:13.640 [5/36] Compiling C object samples/gpio-pci-idio-16.p/gpio-pci-idio-16.c.o
00:03:13.640 [6/36] Compiling C object lib/libvfio-user.a.p/tran.c.o
00:03:13.640 [7/36] Compiling C object lib/libvfio-user.a.p/migration.c.o
00:03:13.640 [8/36] Compiling C object samples/client.p/.._lib_tran.c.o
00:03:13.640 [9/36] Compiling C object lib/libvfio-user.a.p/dma.c.o
00:03:13.640 [10/36] Compiling C object test/unit_tests.p/.._lib_tran_pipe.c.o
00:03:13.640 [11/36] Compiling C object lib/libvfio-user.a.p/pci.c.o
00:03:13.640 [12/36] Compiling C object lib/libvfio-user.a.p/tran_sock.c.o
00:03:13.640 [13/36] Compiling C object samples/client.p/.._lib_migration.c.o
00:03:13.640 [14/36] Compiling C object test/unit_tests.p/.._lib_irq.c.o
00:03:13.640 [15/36] Compiling C object test/unit_tests.p/.._lib_migration.c.o
00:03:13.640 [16/36] Compiling C object test/unit_tests.p/.._lib_tran.c.o
00:03:13.640 [17/36] Compiling C object lib/libvfio-user.a.p/pci_caps.c.o
00:03:13.640 [18/36] Compiling C object test/unit_tests.p/mocks.c.o
00:03:13.640 [19/36] Compiling C object test/unit_tests.p/.._lib_pci.c.o
00:03:13.640 [20/36] Compiling C object samples/client.p/.._lib_tran_sock.c.o
00:03:13.640 [21/36] Compiling C object samples/server.p/server.c.o
00:03:13.640 [22/36] Compiling C object test/unit_tests.p/.._lib_dma.c.o
00:03:13.640 [23/36] Compiling C object test/unit_tests.p/.._lib_tran_sock.c.o
00:03:13.640 [24/36] Compiling C object test/unit_tests.p/.._lib_pci_caps.c.o
00:03:13.640 [25/36] Compiling C object test/unit_tests.p/unit-tests.c.o
00:03:13.640 [26/36] Compiling C object samples/client.p/client.c.o
00:03:13.640 [27/36] Compiling C object lib/libvfio-user.a.p/libvfio-user.c.o
00:03:13.640 [28/36] Compiling C object test/unit_tests.p/.._lib_libvfio-user.c.o
00:03:13.640 [29/36] Linking static target lib/libvfio-user.a
00:03:13.640 [30/36] Linking target samples/client
00:03:13.640 [31/36] Linking target test/unit_tests
00:03:13.640 [32/36] Linking target samples/null
00:03:13.640 [33/36] Linking target samples/gpio-pci-idio-16
00:03:13.640 [34/36] Linking target samples/server
00:03:13.640 [35/36] Linking target samples/shadow_ioeventfd_server
00:03:13.640 [36/36] Linking target samples/lspci
00:03:13.640 INFO: autodetecting backend as ninja
00:03:13.640 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug
00:03:13.640 DESTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user meson install --quiet -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug
00:03:14.208 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug'
00:03:14.208 ninja: no work to do.
00:03:17.492 CC lib/log/log.o
00:03:17.492 CC lib/log/log_deprecated.o
00:03:17.492 CC lib/log/log_flags.o
00:03:17.492 CC lib/ut_mock/mock.o
00:03:17.492 CC lib/ut/ut.o
00:03:17.492 LIB libspdk_log.a
00:03:17.492 LIB libspdk_ut_mock.a
00:03:17.492 LIB libspdk_ut.a
00:03:17.492 CC lib/dma/dma.o
00:03:17.492 CXX lib/trace_parser/trace.o
00:03:17.492 CC lib/util/base64.o
00:03:17.492 CC lib/util/bit_array.o
00:03:17.492 CC lib/ioat/ioat.o
00:03:17.492 CC lib/util/cpuset.o
00:03:17.492 CC lib/util/crc16.o
00:03:17.492 CC lib/util/crc32.o
00:03:17.492 CC lib/util/crc32c.o
00:03:17.492 CC lib/util/crc32_ieee.o
00:03:17.492 CC lib/util/fd.o
00:03:17.492 CC lib/util/crc64.o
00:03:17.492 CC lib/util/dif.o
00:03:17.492 CC lib/util/file.o
00:03:17.492 CC lib/util/hexlify.o
00:03:17.492 CC lib/util/iov.o
00:03:17.492 CC lib/util/math.o
00:03:17.492 CC lib/util/pipe.o
00:03:17.492 CC lib/util/strerror_tls.o
00:03:17.492 CC lib/util/fd_group.o
00:03:17.492 CC lib/util/string.o
00:03:17.492 CC lib/util/uuid.o
00:03:17.492 CC lib/util/xor.o
00:03:17.492 CC lib/util/zipf.o
00:03:17.751 LIB libspdk_dma.a
00:03:17.751 CC lib/vfio_user/host/vfio_user_pci.o
00:03:17.751 CC lib/vfio_user/host/vfio_user.o
00:03:17.751 LIB libspdk_ioat.a
00:03:18.009 LIB libspdk_vfio_user.a
00:03:18.009 LIB libspdk_util.a
00:03:18.009 LIB libspdk_trace_parser.a
00:03:18.268 CC lib/rdma/common.o
00:03:18.268 CC lib/rdma/rdma_verbs.o
00:03:18.268 CC lib/conf/conf.o
00:03:18.268 CC lib/vmd/vmd.o
00:03:18.268 CC lib/vmd/led.o
00:03:18.268 CC lib/idxd/idxd.o
00:03:18.268 CC lib/idxd/idxd_user.o
00:03:18.268 CC lib/json/json_parse.o
00:03:18.268 CC lib/json/json_util.o
00:03:18.268 CC lib/json/json_write.o
00:03:18.268 CC lib/env_dpdk/env.o
00:03:18.268 CC lib/env_dpdk/pci.o
00:03:18.268 CC lib/env_dpdk/memory.o
00:03:18.268 CC lib/env_dpdk/init.o
00:03:18.268 CC lib/env_dpdk/threads.o
00:03:18.268 CC lib/env_dpdk/pci_ioat.o
00:03:18.268 CC lib/env_dpdk/pci_virtio.o
00:03:18.268 CC lib/env_dpdk/pci_vmd.o
00:03:18.268 CC lib/env_dpdk/pci_idxd.o
00:03:18.268 CC lib/env_dpdk/pci_event.o
00:03:18.268 CC lib/env_dpdk/sigbus_handler.o
00:03:18.268 CC lib/env_dpdk/pci_dpdk.o
00:03:18.268 CC lib/env_dpdk/pci_dpdk_2207.o
00:03:18.268 CC lib/env_dpdk/pci_dpdk_2211.o
00:03:18.268 LIB libspdk_conf.a
00:03:18.268 LIB libspdk_rdma.a
00:03:18.527 LIB libspdk_json.a
00:03:18.527 LIB libspdk_idxd.a
00:03:18.527 LIB libspdk_vmd.a
00:03:18.786 CC lib/jsonrpc/jsonrpc_server.o
00:03:18.786 CC lib/jsonrpc/jsonrpc_server_tcp.o
00:03:18.786 CC lib/jsonrpc/jsonrpc_client.o
00:03:18.786 CC lib/jsonrpc/jsonrpc_client_tcp.o
00:03:18.786 LIB libspdk_jsonrpc.a
00:03:19.045 LIB libspdk_env_dpdk.a
00:03:19.304 CC lib/rpc/rpc.o
00:03:19.304 LIB libspdk_rpc.a
00:03:19.563 CC lib/notify/notify.o
00:03:19.563 CC lib/notify/notify_rpc.o
00:03:19.563 CC lib/keyring/keyring.o
00:03:19.563 CC lib/trace/trace.o
00:03:19.563 CC lib/keyring/keyring_rpc.o
00:03:19.563 CC lib/trace/trace_flags.o
00:03:19.563 CC lib/trace/trace_rpc.o
00:03:19.820 LIB libspdk_notify.a
00:03:19.820 LIB libspdk_trace.a
00:03:19.820 LIB libspdk_keyring.a
00:03:20.078 CC lib/thread/thread.o
00:03:20.078 CC lib/thread/iobuf.o
00:03:20.078 CC lib/sock/sock.o
00:03:20.078 CC lib/sock/sock_rpc.o
00:03:20.337 LIB libspdk_sock.a
00:03:20.903 CC lib/nvme/nvme_ctrlr_cmd.o
00:03:20.903 CC lib/nvme/nvme_ns_cmd.o 00:03:20.903 CC lib/nvme/nvme_ctrlr.o 00:03:20.903 CC lib/nvme/nvme_fabric.o 00:03:20.903 CC lib/nvme/nvme_ns.o 00:03:20.903 CC lib/nvme/nvme_pcie_common.o 00:03:20.903 CC lib/nvme/nvme_pcie.o 00:03:20.904 CC lib/nvme/nvme_qpair.o 00:03:20.904 CC lib/nvme/nvme.o 00:03:20.904 CC lib/nvme/nvme_quirks.o 00:03:20.904 CC lib/nvme/nvme_transport.o 00:03:20.904 CC lib/nvme/nvme_discovery.o 00:03:20.904 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:03:20.904 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:03:20.904 CC lib/nvme/nvme_io_msg.o 00:03:20.904 CC lib/nvme/nvme_tcp.o 00:03:20.904 CC lib/nvme/nvme_opal.o 00:03:20.904 CC lib/nvme/nvme_poll_group.o 00:03:20.904 CC lib/nvme/nvme_zns.o 00:03:20.904 CC lib/nvme/nvme_stubs.o 00:03:20.904 CC lib/nvme/nvme_auth.o 00:03:20.904 CC lib/nvme/nvme_cuse.o 00:03:20.904 CC lib/nvme/nvme_vfio_user.o 00:03:20.904 CC lib/nvme/nvme_rdma.o 00:03:20.904 LIB libspdk_thread.a 00:03:21.162 CC lib/blob/blobstore.o 00:03:21.162 CC lib/blob/request.o 00:03:21.162 CC lib/blob/blob_bs_dev.o 00:03:21.162 CC lib/blob/zeroes.o 00:03:21.162 CC lib/vfu_tgt/tgt_endpoint.o 00:03:21.162 CC lib/vfu_tgt/tgt_rpc.o 00:03:21.162 CC lib/accel/accel.o 00:03:21.162 CC lib/accel/accel_rpc.o 00:03:21.162 CC lib/accel/accel_sw.o 00:03:21.162 CC lib/virtio/virtio.o 00:03:21.162 CC lib/virtio/virtio_vhost_user.o 00:03:21.162 CC lib/virtio/virtio_vfio_user.o 00:03:21.162 CC lib/virtio/virtio_pci.o 00:03:21.162 CC lib/init/json_config.o 00:03:21.162 CC lib/init/rpc.o 00:03:21.162 CC lib/init/subsystem.o 00:03:21.162 CC lib/init/subsystem_rpc.o 00:03:21.420 LIB libspdk_init.a 00:03:21.420 LIB libspdk_virtio.a 00:03:21.420 LIB libspdk_vfu_tgt.a 00:03:21.678 CC lib/event/app.o 00:03:21.678 CC lib/event/reactor.o 00:03:21.678 CC lib/event/app_rpc.o 00:03:21.678 CC lib/event/log_rpc.o 00:03:21.678 CC lib/event/scheduler_static.o 00:03:21.938 LIB libspdk_accel.a 00:03:21.938 LIB libspdk_event.a 00:03:21.938 LIB libspdk_nvme.a 00:03:22.195 CC lib/bdev/bdev.o 00:03:22.195 CC lib/bdev/bdev_rpc.o 00:03:22.195 CC lib/bdev/bdev_zone.o 00:03:22.195 CC lib/bdev/scsi_nvme.o 00:03:22.195 CC lib/bdev/part.o 00:03:22.762 LIB libspdk_blob.a 00:03:23.021 CC lib/blobfs/blobfs.o 00:03:23.021 CC lib/blobfs/tree.o 00:03:23.280 CC lib/lvol/lvol.o 00:03:23.538 LIB libspdk_lvol.a 00:03:23.538 LIB libspdk_blobfs.a 00:03:23.797 LIB libspdk_bdev.a 00:03:24.362 CC lib/nbd/nbd.o 00:03:24.362 CC lib/nbd/nbd_rpc.o 00:03:24.362 CC lib/ftl/ftl_core.o 00:03:24.362 CC lib/ftl/ftl_init.o 00:03:24.362 CC lib/ftl/ftl_layout.o 00:03:24.362 CC lib/ftl/ftl_io.o 00:03:24.362 CC lib/ftl/ftl_debug.o 00:03:24.362 CC lib/nvmf/ctrlr_discovery.o 00:03:24.362 CC lib/ftl/ftl_sb.o 00:03:24.362 CC lib/nvmf/ctrlr.o 00:03:24.362 CC lib/ftl/ftl_l2p.o 00:03:24.362 CC lib/scsi/dev.o 00:03:24.362 CC lib/ftl/ftl_l2p_flat.o 00:03:24.362 CC lib/nvmf/ctrlr_bdev.o 00:03:24.362 CC lib/ftl/ftl_nv_cache.o 00:03:24.362 CC lib/scsi/lun.o 00:03:24.362 CC lib/ftl/ftl_band.o 00:03:24.362 CC lib/nvmf/subsystem.o 00:03:24.362 CC lib/scsi/port.o 00:03:24.362 CC lib/ublk/ublk.o 00:03:24.362 CC lib/ftl/ftl_band_ops.o 00:03:24.362 CC lib/nvmf/nvmf.o 00:03:24.362 CC lib/ublk/ublk_rpc.o 00:03:24.362 CC lib/scsi/scsi.o 00:03:24.362 CC lib/ftl/ftl_writer.o 00:03:24.362 CC lib/nvmf/nvmf_rpc.o 00:03:24.362 CC lib/scsi/scsi_bdev.o 00:03:24.362 CC lib/ftl/ftl_rq.o 00:03:24.362 CC lib/nvmf/transport.o 00:03:24.362 CC lib/ftl/ftl_reloc.o 00:03:24.362 CC lib/scsi/scsi_pr.o 00:03:24.362 CC lib/nvmf/tcp.o 00:03:24.362 CC lib/ftl/ftl_l2p_cache.o 00:03:24.362 CC 
lib/scsi/scsi_rpc.o 00:03:24.362 CC lib/nvmf/vfio_user.o 00:03:24.362 CC lib/ftl/ftl_p2l.o 00:03:24.362 CC lib/scsi/task.o 00:03:24.362 CC lib/ftl/mngt/ftl_mngt.o 00:03:24.362 CC lib/nvmf/rdma.o 00:03:24.362 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:03:24.362 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:03:24.362 CC lib/ftl/mngt/ftl_mngt_startup.o 00:03:24.362 CC lib/ftl/mngt/ftl_mngt_misc.o 00:03:24.362 CC lib/ftl/mngt/ftl_mngt_md.o 00:03:24.362 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:03:24.362 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:03:24.362 CC lib/ftl/mngt/ftl_mngt_band.o 00:03:24.362 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:03:24.362 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:03:24.362 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:03:24.362 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:03:24.362 CC lib/ftl/utils/ftl_md.o 00:03:24.362 CC lib/ftl/utils/ftl_conf.o 00:03:24.362 CC lib/ftl/utils/ftl_bitmap.o 00:03:24.362 CC lib/ftl/utils/ftl_mempool.o 00:03:24.362 CC lib/ftl/utils/ftl_property.o 00:03:24.362 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:03:24.363 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:03:24.363 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:03:24.363 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:03:24.363 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:03:24.363 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:03:24.363 CC lib/ftl/upgrade/ftl_sb_v5.o 00:03:24.363 CC lib/ftl/nvc/ftl_nvc_dev.o 00:03:24.363 CC lib/ftl/upgrade/ftl_sb_v3.o 00:03:24.363 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:03:24.363 CC lib/ftl/base/ftl_base_bdev.o 00:03:24.363 CC lib/ftl/base/ftl_base_dev.o 00:03:24.363 CC lib/ftl/ftl_trace.o 00:03:24.620 LIB libspdk_nbd.a 00:03:24.620 LIB libspdk_scsi.a 00:03:24.879 LIB libspdk_ublk.a 00:03:24.879 LIB libspdk_ftl.a 00:03:24.879 CC lib/vhost/vhost.o 00:03:24.879 CC lib/vhost/vhost_rpc.o 00:03:24.879 CC lib/vhost/vhost_scsi.o 00:03:24.879 CC lib/vhost/vhost_blk.o 00:03:24.879 CC lib/vhost/rte_vhost_user.o 00:03:24.879 CC lib/iscsi/init_grp.o 00:03:24.879 CC lib/iscsi/iscsi.o 00:03:24.879 CC lib/iscsi/md5.o 00:03:24.879 CC lib/iscsi/conn.o 00:03:24.879 CC lib/iscsi/param.o 00:03:24.879 CC lib/iscsi/portal_grp.o 00:03:24.879 CC lib/iscsi/iscsi_subsystem.o 00:03:24.879 CC lib/iscsi/tgt_node.o 00:03:24.879 CC lib/iscsi/task.o 00:03:24.879 CC lib/iscsi/iscsi_rpc.o 00:03:25.447 LIB libspdk_nvmf.a 00:03:25.447 LIB libspdk_vhost.a 00:03:25.704 LIB libspdk_iscsi.a 00:03:26.272 CC module/vfu_device/vfu_virtio.o 00:03:26.272 CC module/vfu_device/vfu_virtio_blk.o 00:03:26.272 CC module/vfu_device/vfu_virtio_scsi.o 00:03:26.272 CC module/vfu_device/vfu_virtio_rpc.o 00:03:26.272 CC module/env_dpdk/env_dpdk_rpc.o 00:03:26.272 CC module/blob/bdev/blob_bdev.o 00:03:26.272 LIB libspdk_env_dpdk_rpc.a 00:03:26.272 CC module/scheduler/dynamic/scheduler_dynamic.o 00:03:26.272 CC module/accel/iaa/accel_iaa_rpc.o 00:03:26.272 CC module/accel/iaa/accel_iaa.o 00:03:26.272 CC module/scheduler/gscheduler/gscheduler.o 00:03:26.272 CC module/sock/posix/posix.o 00:03:26.272 CC module/keyring/file/keyring.o 00:03:26.272 CC module/keyring/file/keyring_rpc.o 00:03:26.272 CC module/accel/ioat/accel_ioat.o 00:03:26.272 CC module/accel/ioat/accel_ioat_rpc.o 00:03:26.272 CC module/accel/dsa/accel_dsa.o 00:03:26.272 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:03:26.272 CC module/accel/dsa/accel_dsa_rpc.o 00:03:26.272 CC module/accel/error/accel_error.o 00:03:26.272 CC module/accel/error/accel_error_rpc.o 00:03:26.531 LIB libspdk_scheduler_gscheduler.a 00:03:26.531 LIB libspdk_scheduler_dpdk_governor.a 00:03:26.531 LIB libspdk_keyring_file.a 00:03:26.531 LIB 
libspdk_scheduler_dynamic.a 00:03:26.531 LIB libspdk_blob_bdev.a 00:03:26.531 LIB libspdk_accel_error.a 00:03:26.531 LIB libspdk_accel_ioat.a 00:03:26.531 LIB libspdk_accel_iaa.a 00:03:26.531 LIB libspdk_accel_dsa.a 00:03:26.531 LIB libspdk_vfu_device.a 00:03:26.790 LIB libspdk_sock_posix.a 00:03:26.791 CC module/bdev/delay/vbdev_delay.o 00:03:26.791 CC module/bdev/delay/vbdev_delay_rpc.o 00:03:27.049 CC module/bdev/iscsi/bdev_iscsi.o 00:03:27.049 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:03:27.049 CC module/bdev/lvol/vbdev_lvol.o 00:03:27.050 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:03:27.050 CC module/bdev/null/bdev_null.o 00:03:27.050 CC module/bdev/gpt/gpt.o 00:03:27.050 CC module/bdev/raid/bdev_raid.o 00:03:27.050 CC module/bdev/null/bdev_null_rpc.o 00:03:27.050 CC module/bdev/error/vbdev_error.o 00:03:27.050 CC module/bdev/raid/bdev_raid_rpc.o 00:03:27.050 CC module/bdev/raid/raid0.o 00:03:27.050 CC module/bdev/gpt/vbdev_gpt.o 00:03:27.050 CC module/bdev/error/vbdev_error_rpc.o 00:03:27.050 CC module/bdev/raid/bdev_raid_sb.o 00:03:27.050 CC module/bdev/aio/bdev_aio.o 00:03:27.050 CC module/bdev/raid/raid1.o 00:03:27.050 CC module/bdev/raid/concat.o 00:03:27.050 CC module/bdev/aio/bdev_aio_rpc.o 00:03:27.050 CC module/bdev/split/vbdev_split.o 00:03:27.050 CC module/bdev/nvme/bdev_nvme.o 00:03:27.050 CC module/blobfs/bdev/blobfs_bdev.o 00:03:27.050 CC module/bdev/zone_block/vbdev_zone_block.o 00:03:27.050 CC module/bdev/nvme/bdev_nvme_rpc.o 00:03:27.050 CC module/bdev/split/vbdev_split_rpc.o 00:03:27.050 CC module/bdev/malloc/bdev_malloc.o 00:03:27.050 CC module/bdev/nvme/vbdev_opal.o 00:03:27.050 CC module/bdev/nvme/vbdev_opal_rpc.o 00:03:27.050 CC module/bdev/nvme/bdev_mdns_client.o 00:03:27.050 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:03:27.050 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:03:27.050 CC module/bdev/nvme/nvme_rpc.o 00:03:27.050 CC module/bdev/virtio/bdev_virtio_scsi.o 00:03:27.050 CC module/bdev/virtio/bdev_virtio_blk.o 00:03:27.050 CC module/bdev/virtio/bdev_virtio_rpc.o 00:03:27.050 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:03:27.050 CC module/bdev/malloc/bdev_malloc_rpc.o 00:03:27.050 CC module/bdev/ftl/bdev_ftl.o 00:03:27.050 CC module/bdev/ftl/bdev_ftl_rpc.o 00:03:27.050 CC module/bdev/passthru/vbdev_passthru.o 00:03:27.050 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:03:27.050 LIB libspdk_blobfs_bdev.a 00:03:27.050 LIB libspdk_bdev_split.a 00:03:27.050 LIB libspdk_bdev_gpt.a 00:03:27.050 LIB libspdk_bdev_null.a 00:03:27.050 LIB libspdk_bdev_error.a 00:03:27.050 LIB libspdk_bdev_iscsi.a 00:03:27.050 LIB libspdk_bdev_ftl.a 00:03:27.308 LIB libspdk_bdev_passthru.a 00:03:27.308 LIB libspdk_bdev_aio.a 00:03:27.308 LIB libspdk_bdev_delay.a 00:03:27.308 LIB libspdk_bdev_zone_block.a 00:03:27.308 LIB libspdk_bdev_malloc.a 00:03:27.308 LIB libspdk_bdev_lvol.a 00:03:27.308 LIB libspdk_bdev_virtio.a 00:03:27.567 LIB libspdk_bdev_raid.a 00:03:28.257 LIB libspdk_bdev_nvme.a 00:03:28.825 CC module/event/subsystems/iobuf/iobuf.o 00:03:28.825 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:03:28.825 CC module/event/subsystems/sock/sock.o 00:03:28.825 CC module/event/subsystems/keyring/keyring.o 00:03:28.825 CC module/event/subsystems/vfu_tgt/vfu_tgt.o 00:03:28.825 CC module/event/subsystems/vmd/vmd.o 00:03:28.825 CC module/event/subsystems/vmd/vmd_rpc.o 00:03:28.825 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:03:28.825 CC module/event/subsystems/scheduler/scheduler.o 00:03:28.825 LIB libspdk_event_sock.a 00:03:28.825 LIB libspdk_event_vfu_tgt.a 00:03:28.825 
LIB libspdk_event_iobuf.a 00:03:28.825 LIB libspdk_event_keyring.a 00:03:28.825 LIB libspdk_event_vmd.a 00:03:28.825 LIB libspdk_event_vhost_blk.a 00:03:28.825 LIB libspdk_event_scheduler.a 00:03:29.084 CC module/event/subsystems/accel/accel.o 00:03:29.343 LIB libspdk_event_accel.a 00:03:29.602 CC module/event/subsystems/bdev/bdev.o 00:03:29.602 LIB libspdk_event_bdev.a 00:03:30.170 CC module/event/subsystems/ublk/ublk.o 00:03:30.170 CC module/event/subsystems/nbd/nbd.o 00:03:30.170 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:03:30.170 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:03:30.170 CC module/event/subsystems/scsi/scsi.o 00:03:30.170 LIB libspdk_event_ublk.a 00:03:30.170 LIB libspdk_event_nbd.a 00:03:30.170 LIB libspdk_event_scsi.a 00:03:30.170 LIB libspdk_event_nvmf.a 00:03:30.430 CC module/event/subsystems/iscsi/iscsi.o 00:03:30.430 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:03:30.430 LIB libspdk_event_iscsi.a 00:03:30.688 LIB libspdk_event_vhost_scsi.a 00:03:30.951 CC app/spdk_nvme_perf/perf.o 00:03:30.951 CC app/spdk_nvme_discover/discovery_aer.o 00:03:30.951 CC app/spdk_lspci/spdk_lspci.o 00:03:30.951 TEST_HEADER include/spdk/accel_module.h 00:03:30.951 TEST_HEADER include/spdk/accel.h 00:03:30.951 CXX app/trace/trace.o 00:03:30.952 CC test/rpc_client/rpc_client_test.o 00:03:30.952 TEST_HEADER include/spdk/assert.h 00:03:30.952 TEST_HEADER include/spdk/base64.h 00:03:30.952 CC app/trace_record/trace_record.o 00:03:30.952 TEST_HEADER include/spdk/barrier.h 00:03:30.952 TEST_HEADER include/spdk/bdev.h 00:03:30.952 TEST_HEADER include/spdk/bdev_module.h 00:03:30.952 TEST_HEADER include/spdk/bdev_zone.h 00:03:30.952 TEST_HEADER include/spdk/bit_array.h 00:03:30.952 TEST_HEADER include/spdk/bit_pool.h 00:03:30.952 TEST_HEADER include/spdk/blob_bdev.h 00:03:30.952 TEST_HEADER include/spdk/blobfs_bdev.h 00:03:30.952 TEST_HEADER include/spdk/blobfs.h 00:03:30.952 TEST_HEADER include/spdk/blob.h 00:03:30.952 TEST_HEADER include/spdk/config.h 00:03:30.952 TEST_HEADER include/spdk/conf.h 00:03:30.952 TEST_HEADER include/spdk/cpuset.h 00:03:30.952 TEST_HEADER include/spdk/crc16.h 00:03:30.952 TEST_HEADER include/spdk/crc32.h 00:03:30.952 TEST_HEADER include/spdk/crc64.h 00:03:30.952 TEST_HEADER include/spdk/dif.h 00:03:30.952 TEST_HEADER include/spdk/dma.h 00:03:30.952 CC app/spdk_top/spdk_top.o 00:03:30.952 TEST_HEADER include/spdk/endian.h 00:03:30.952 TEST_HEADER include/spdk/env_dpdk.h 00:03:30.952 TEST_HEADER include/spdk/env.h 00:03:30.952 TEST_HEADER include/spdk/event.h 00:03:30.952 TEST_HEADER include/spdk/fd_group.h 00:03:30.952 TEST_HEADER include/spdk/fd.h 00:03:30.952 TEST_HEADER include/spdk/file.h 00:03:30.952 CC app/spdk_nvme_identify/identify.o 00:03:30.952 TEST_HEADER include/spdk/gpt_spec.h 00:03:30.952 TEST_HEADER include/spdk/ftl.h 00:03:30.952 TEST_HEADER include/spdk/hexlify.h 00:03:30.952 TEST_HEADER include/spdk/histogram_data.h 00:03:30.952 TEST_HEADER include/spdk/idxd.h 00:03:30.952 TEST_HEADER include/spdk/idxd_spec.h 00:03:30.952 TEST_HEADER include/spdk/init.h 00:03:30.952 TEST_HEADER include/spdk/ioat_spec.h 00:03:30.952 TEST_HEADER include/spdk/ioat.h 00:03:30.952 TEST_HEADER include/spdk/iscsi_spec.h 00:03:30.952 TEST_HEADER include/spdk/json.h 00:03:30.952 TEST_HEADER include/spdk/jsonrpc.h 00:03:30.952 TEST_HEADER include/spdk/keyring.h 00:03:30.952 TEST_HEADER include/spdk/keyring_module.h 00:03:30.952 TEST_HEADER include/spdk/likely.h 00:03:30.952 TEST_HEADER include/spdk/log.h 00:03:30.952 TEST_HEADER include/spdk/lvol.h 00:03:30.952 
TEST_HEADER include/spdk/memory.h 00:03:30.952 TEST_HEADER include/spdk/mmio.h 00:03:30.952 TEST_HEADER include/spdk/notify.h 00:03:30.952 TEST_HEADER include/spdk/nbd.h 00:03:30.952 TEST_HEADER include/spdk/nvme.h 00:03:30.952 TEST_HEADER include/spdk/nvme_intel.h 00:03:30.952 TEST_HEADER include/spdk/nvme_ocssd.h 00:03:30.952 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:03:30.952 TEST_HEADER include/spdk/nvme_spec.h 00:03:30.952 TEST_HEADER include/spdk/nvme_zns.h 00:03:30.952 TEST_HEADER include/spdk/nvmf_cmd.h 00:03:30.952 CC app/iscsi_tgt/iscsi_tgt.o 00:03:30.952 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:03:30.952 CC app/spdk_dd/spdk_dd.o 00:03:30.952 TEST_HEADER include/spdk/nvmf.h 00:03:30.952 TEST_HEADER include/spdk/nvmf_transport.h 00:03:30.952 TEST_HEADER include/spdk/nvmf_spec.h 00:03:30.952 TEST_HEADER include/spdk/opal.h 00:03:30.952 CC app/nvmf_tgt/nvmf_main.o 00:03:30.952 TEST_HEADER include/spdk/opal_spec.h 00:03:30.952 TEST_HEADER include/spdk/pci_ids.h 00:03:30.952 TEST_HEADER include/spdk/pipe.h 00:03:30.952 TEST_HEADER include/spdk/queue.h 00:03:30.952 TEST_HEADER include/spdk/reduce.h 00:03:30.952 CC examples/interrupt_tgt/interrupt_tgt.o 00:03:30.952 TEST_HEADER include/spdk/rpc.h 00:03:30.952 TEST_HEADER include/spdk/scheduler.h 00:03:30.952 TEST_HEADER include/spdk/scsi.h 00:03:30.952 TEST_HEADER include/spdk/scsi_spec.h 00:03:30.952 TEST_HEADER include/spdk/sock.h 00:03:30.952 TEST_HEADER include/spdk/stdinc.h 00:03:30.952 TEST_HEADER include/spdk/string.h 00:03:30.952 TEST_HEADER include/spdk/trace.h 00:03:30.952 TEST_HEADER include/spdk/thread.h 00:03:30.952 TEST_HEADER include/spdk/trace_parser.h 00:03:30.952 TEST_HEADER include/spdk/tree.h 00:03:30.952 TEST_HEADER include/spdk/ublk.h 00:03:30.952 TEST_HEADER include/spdk/util.h 00:03:30.952 TEST_HEADER include/spdk/uuid.h 00:03:30.952 TEST_HEADER include/spdk/version.h 00:03:30.952 TEST_HEADER include/spdk/vfio_user_pci.h 00:03:30.952 TEST_HEADER include/spdk/vfio_user_spec.h 00:03:30.952 TEST_HEADER include/spdk/vhost.h 00:03:30.952 TEST_HEADER include/spdk/vmd.h 00:03:30.952 TEST_HEADER include/spdk/xor.h 00:03:30.952 TEST_HEADER include/spdk/zipf.h 00:03:30.952 CXX test/cpp_headers/accel.o 00:03:30.952 CXX test/cpp_headers/accel_module.o 00:03:30.952 CC app/vhost/vhost.o 00:03:30.952 CXX test/cpp_headers/assert.o 00:03:30.952 CXX test/cpp_headers/barrier.o 00:03:30.952 CXX test/cpp_headers/base64.o 00:03:30.952 CXX test/cpp_headers/bdev.o 00:03:30.952 CXX test/cpp_headers/bdev_module.o 00:03:30.952 CXX test/cpp_headers/bdev_zone.o 00:03:30.952 CXX test/cpp_headers/bit_array.o 00:03:30.952 CXX test/cpp_headers/bit_pool.o 00:03:30.952 CXX test/cpp_headers/blob_bdev.o 00:03:30.952 CXX test/cpp_headers/blobfs_bdev.o 00:03:30.952 CXX test/cpp_headers/blobfs.o 00:03:30.952 CXX test/cpp_headers/blob.o 00:03:30.952 CXX test/cpp_headers/conf.o 00:03:30.952 CXX test/cpp_headers/config.o 00:03:30.952 CXX test/cpp_headers/cpuset.o 00:03:30.952 CXX test/cpp_headers/crc16.o 00:03:30.952 CXX test/cpp_headers/crc32.o 00:03:30.952 CXX test/cpp_headers/dif.o 00:03:30.952 CXX test/cpp_headers/crc64.o 00:03:30.952 CC app/spdk_tgt/spdk_tgt.o 00:03:30.952 CXX test/cpp_headers/dma.o 00:03:30.952 CXX test/cpp_headers/endian.o 00:03:30.952 CXX test/cpp_headers/env.o 00:03:30.952 CXX test/cpp_headers/env_dpdk.o 00:03:30.952 CXX test/cpp_headers/event.o 00:03:30.952 CXX test/cpp_headers/fd_group.o 00:03:30.952 CXX test/cpp_headers/fd.o 00:03:30.952 CXX test/cpp_headers/ftl.o 00:03:30.952 CXX test/cpp_headers/file.o 00:03:30.952 CXX 
test/cpp_headers/gpt_spec.o 00:03:30.952 CXX test/cpp_headers/hexlify.o 00:03:30.952 CXX test/cpp_headers/histogram_data.o 00:03:30.952 CXX test/cpp_headers/idxd.o 00:03:30.952 CXX test/cpp_headers/idxd_spec.o 00:03:30.952 CXX test/cpp_headers/init.o 00:03:30.952 CC test/event/reactor/reactor.o 00:03:30.952 CC test/event/event_perf/event_perf.o 00:03:30.952 CC test/event/reactor_perf/reactor_perf.o 00:03:30.952 CC test/event/app_repeat/app_repeat.o 00:03:30.952 CC examples/nvme/hello_world/hello_world.o 00:03:30.952 CC examples/idxd/perf/perf.o 00:03:30.952 CC test/env/vtophys/vtophys.o 00:03:30.952 CC examples/nvme/reconnect/reconnect.o 00:03:30.952 CC examples/nvme/nvme_manage/nvme_manage.o 00:03:30.952 CC test/nvme/reset/reset.o 00:03:30.952 CC examples/nvme/arbitration/arbitration.o 00:03:30.952 CC examples/accel/perf/accel_perf.o 00:03:30.952 CC test/env/pci/pci_ut.o 00:03:30.952 CC test/app/histogram_perf/histogram_perf.o 00:03:30.952 CC examples/ioat/verify/verify.o 00:03:30.952 CC test/thread/lock/spdk_lock.o 00:03:30.952 CC examples/sock/hello_world/hello_sock.o 00:03:30.952 CC test/nvme/aer/aer.o 00:03:30.952 CC test/app/jsoncat/jsoncat.o 00:03:30.952 CC examples/ioat/perf/perf.o 00:03:30.952 CC test/thread/poller_perf/poller_perf.o 00:03:30.952 CC examples/nvme/cmb_copy/cmb_copy.o 00:03:30.952 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:03:30.952 CC test/env/memory/memory_ut.o 00:03:30.952 CC test/nvme/compliance/nvme_compliance.o 00:03:30.952 CXX test/cpp_headers/ioat.o 00:03:30.952 CC test/nvme/simple_copy/simple_copy.o 00:03:30.952 CC test/nvme/sgl/sgl.o 00:03:30.952 CC test/nvme/overhead/overhead.o 00:03:30.952 CC test/nvme/err_injection/err_injection.o 00:03:30.952 CC test/app/stub/stub.o 00:03:30.952 CC examples/nvme/hotplug/hotplug.o 00:03:30.952 CC test/event/scheduler/scheduler.o 00:03:31.225 CC test/nvme/doorbell_aers/doorbell_aers.o 00:03:31.225 CC examples/util/zipf/zipf.o 00:03:31.225 CC test/nvme/fdp/fdp.o 00:03:31.225 CC test/nvme/e2edp/nvme_dp.o 00:03:31.225 CC test/nvme/fused_ordering/fused_ordering.o 00:03:31.225 CC examples/nvme/abort/abort.o 00:03:31.225 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:03:31.225 CC test/nvme/startup/startup.o 00:03:31.225 CC test/nvme/reserve/reserve.o 00:03:31.225 CC test/nvme/connect_stress/connect_stress.o 00:03:31.225 CC test/nvme/cuse/cuse.o 00:03:31.225 CC examples/vmd/lsvmd/lsvmd.o 00:03:31.225 CC examples/vmd/led/led.o 00:03:31.225 CC test/bdev/bdevio/bdevio.o 00:03:31.225 CC test/nvme/boot_partition/boot_partition.o 00:03:31.225 CC app/fio/nvme/fio_plugin.o 00:03:31.225 CC examples/thread/thread/thread_ex.o 00:03:31.225 LINK spdk_lspci 00:03:31.225 CC test/accel/dif/dif.o 00:03:31.225 CC test/app/bdev_svc/bdev_svc.o 00:03:31.225 CC examples/blob/cli/blobcli.o 00:03:31.225 CC examples/blob/hello_world/hello_blob.o 00:03:31.225 CC examples/bdev/hello_world/hello_bdev.o 00:03:31.225 CC app/fio/bdev/fio_plugin.o 00:03:31.225 CC test/dma/test_dma/test_dma.o 00:03:31.225 CC examples/nvmf/nvmf/nvmf.o 00:03:31.225 CC test/blobfs/mkfs/mkfs.o 00:03:31.225 CC examples/bdev/bdevperf/bdevperf.o 00:03:31.225 LINK spdk_nvme_discover 00:03:31.225 LINK rpc_client_test 00:03:31.225 CC test/lvol/esnap/esnap.o 00:03:31.225 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:03:31.225 CC test/env/mem_callbacks/mem_callbacks.o 00:03:31.225 LINK interrupt_tgt 00:03:31.225 LINK spdk_trace_record 00:03:31.225 LINK nvmf_tgt 00:03:31.225 LINK reactor 00:03:31.225 CXX test/cpp_headers/ioat_spec.o 00:03:31.225 CXX 
test/cpp_headers/iscsi_spec.o 00:03:31.225 LINK event_perf 00:03:31.225 CXX test/cpp_headers/json.o 00:03:31.225 LINK reactor_perf 00:03:31.225 CXX test/cpp_headers/jsonrpc.o 00:03:31.225 CXX test/cpp_headers/keyring.o 00:03:31.225 CXX test/cpp_headers/keyring_module.o 00:03:31.225 CXX test/cpp_headers/likely.o 00:03:31.225 CXX test/cpp_headers/log.o 00:03:31.225 CXX test/cpp_headers/lvol.o 00:03:31.225 CXX test/cpp_headers/memory.o 00:03:31.225 CXX test/cpp_headers/mmio.o 00:03:31.225 CXX test/cpp_headers/nbd.o 00:03:31.225 CXX test/cpp_headers/notify.o 00:03:31.225 LINK app_repeat 00:03:31.225 CXX test/cpp_headers/nvme.o 00:03:31.225 CXX test/cpp_headers/nvme_intel.o 00:03:31.225 CXX test/cpp_headers/nvme_ocssd.o 00:03:31.225 CXX test/cpp_headers/nvme_ocssd_spec.o 00:03:31.225 CXX test/cpp_headers/nvme_spec.o 00:03:31.225 CXX test/cpp_headers/nvme_zns.o 00:03:31.225 LINK lsvmd 00:03:31.225 CXX test/cpp_headers/nvmf_cmd.o 00:03:31.225 CXX test/cpp_headers/nvmf_fc_spec.o 00:03:31.225 LINK vtophys 00:03:31.225 CXX test/cpp_headers/nvmf.o 00:03:31.225 CXX test/cpp_headers/nvmf_spec.o 00:03:31.225 LINK jsoncat 00:03:31.225 LINK vhost 00:03:31.225 CXX test/cpp_headers/nvmf_transport.o 00:03:31.225 LINK iscsi_tgt 00:03:31.225 LINK histogram_perf 00:03:31.225 LINK led 00:03:31.225 CXX test/cpp_headers/opal_spec.o 00:03:31.225 CXX test/cpp_headers/opal.o 00:03:31.225 CXX test/cpp_headers/pci_ids.o 00:03:31.225 CXX test/cpp_headers/pipe.o 00:03:31.225 CXX test/cpp_headers/queue.o 00:03:31.225 CXX test/cpp_headers/reduce.o 00:03:31.225 LINK poller_perf 00:03:31.225 LINK zipf 00:03:31.225 LINK env_dpdk_post_init 00:03:31.225 LINK spdk_tgt 00:03:31.225 CXX test/cpp_headers/rpc.o 00:03:31.225 LINK err_injection 00:03:31.225 LINK doorbell_aers 00:03:31.225 LINK connect_stress 00:03:31.227 CXX test/cpp_headers/scheduler.o 00:03:31.227 CXX test/cpp_headers/scsi.o 00:03:31.488 LINK boot_partition 00:03:31.488 LINK startup 00:03:31.488 LINK stub 00:03:31.488 LINK ioat_perf 00:03:31.488 LINK pmr_persistence 00:03:31.488 LINK fused_ordering 00:03:31.488 fio_plugin.c:1491:29: warning: field 'ruhs' with variable sized type 'struct spdk_nvme_fdp_ruhs' not at the end of a struct or class is a GNU extension [-Wgnu-variable-sized-type-not-at-end] 00:03:31.488 struct spdk_nvme_fdp_ruhs ruhs; 00:03:31.489 ^ 00:03:31.489 LINK bdev_svc 00:03:31.489 LINK simple_copy 00:03:31.489 CXX test/cpp_headers/scsi_spec.o 00:03:31.489 LINK verify 00:03:31.489 LINK reserve 00:03:31.489 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:03:31.489 LINK hotplug 00:03:31.489 LINK hello_world 00:03:31.489 LINK cmb_copy 00:03:31.489 LINK hello_sock 00:03:31.489 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:03:31.489 LINK nvme_dp 00:03:31.489 LINK scheduler 00:03:31.489 LINK fdp 00:03:31.489 LINK mkfs 00:03:31.489 LINK overhead 00:03:31.489 CC test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.o 00:03:31.489 LINK aer 00:03:31.489 LINK thread 00:03:31.489 CC test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.o 00:03:31.489 LINK spdk_trace 00:03:31.489 LINK reset 00:03:31.489 LINK sgl 00:03:31.489 LINK hello_blob 00:03:31.489 LINK idxd_perf 00:03:31.489 LINK hello_bdev 00:03:31.489 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:03:31.489 CXX test/cpp_headers/sock.o 00:03:31.489 CXX test/cpp_headers/stdinc.o 00:03:31.489 CXX test/cpp_headers/string.o 00:03:31.489 CXX test/cpp_headers/thread.o 00:03:31.489 CXX test/cpp_headers/trace.o 00:03:31.489 CXX test/cpp_headers/trace_parser.o 00:03:31.489 CXX test/cpp_headers/tree.o 00:03:31.489 CXX test/cpp_headers/ublk.o 
00:03:31.489 LINK arbitration 00:03:31.489 CXX test/cpp_headers/util.o 00:03:31.489 CXX test/cpp_headers/uuid.o 00:03:31.489 CXX test/cpp_headers/version.o 00:03:31.489 CXX test/cpp_headers/vfio_user_pci.o 00:03:31.489 LINK reconnect 00:03:31.489 CXX test/cpp_headers/vfio_user_spec.o 00:03:31.489 CXX test/cpp_headers/vhost.o 00:03:31.489 CXX test/cpp_headers/vmd.o 00:03:31.489 CXX test/cpp_headers/xor.o 00:03:31.489 CXX test/cpp_headers/zipf.o 00:03:31.489 LINK nvmf 00:03:31.748 LINK abort 00:03:31.748 LINK dif 00:03:31.748 LINK bdevio 00:03:31.748 LINK nvme_manage 00:03:31.748 LINK spdk_dd 00:03:31.748 LINK test_dma 00:03:31.748 LINK nvme_compliance 00:03:31.748 LINK pci_ut 00:03:31.748 LINK accel_perf 00:03:31.748 1 warning generated. 00:03:31.748 LINK mem_callbacks 00:03:31.748 LINK llvm_vfio_fuzz 00:03:31.748 LINK blobcli 00:03:31.748 LINK nvme_fuzz 00:03:31.748 LINK spdk_nvme 00:03:32.007 LINK spdk_bdev 00:03:32.007 LINK spdk_nvme_identify 00:03:32.007 LINK spdk_top 00:03:32.007 LINK vhost_fuzz 00:03:32.007 LINK spdk_nvme_perf 00:03:32.265 LINK bdevperf 00:03:32.265 LINK memory_ut 00:03:32.265 LINK cuse 00:03:32.265 LINK llvm_nvme_fuzz 00:03:32.833 LINK iscsi_fuzz 00:03:32.833 LINK spdk_lock 00:03:34.741 LINK esnap 00:03:35.309 00:03:35.309 real 0m23.997s 00:03:35.309 user 4m44.573s 00:03:35.309 sys 1m57.858s 00:03:35.309 20:50:50 -- common/autotest_common.sh@1112 -- $ xtrace_disable 00:03:35.309 20:50:50 -- common/autotest_common.sh@10 -- $ set +x 00:03:35.309 ************************************ 00:03:35.309 END TEST make 00:03:35.309 ************************************ 00:03:35.309 20:50:50 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:03:35.309 20:50:50 -- pm/common@30 -- $ signal_monitor_resources TERM 00:03:35.309 20:50:50 -- pm/common@41 -- $ local monitor pid pids signal=TERM 00:03:35.309 20:50:50 -- pm/common@43 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:35.309 20:50:50 -- pm/common@44 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:03:35.309 20:50:50 -- pm/common@45 -- $ pid=48617 00:03:35.309 20:50:50 -- pm/common@52 -- $ sudo kill -TERM 48617 00:03:35.309 20:50:50 -- pm/common@43 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:35.309 20:50:50 -- pm/common@44 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:03:35.309 20:50:50 -- pm/common@45 -- $ pid=48615 00:03:35.309 20:50:50 -- pm/common@52 -- $ sudo kill -TERM 48615 00:03:35.309 20:50:50 -- pm/common@43 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:35.309 20:50:50 -- pm/common@44 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:03:35.309 20:50:50 -- pm/common@45 -- $ pid=48612 00:03:35.309 20:50:50 -- pm/common@52 -- $ sudo kill -TERM 48612 00:03:35.309 20:50:50 -- pm/common@43 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:35.309 20:50:50 -- pm/common@44 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:03:35.309 20:50:50 -- pm/common@45 -- $ pid=48618 00:03:35.309 20:50:50 -- pm/common@52 -- $ sudo kill -TERM 48618 00:03:35.569 20:50:50 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:03:35.569 20:50:50 -- nvmf/common.sh@7 -- # uname -s 00:03:35.569 20:50:50 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:03:35.569 20:50:50 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:03:35.569 20:50:50 -- 
nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:03:35.569 20:50:50 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:03:35.569 20:50:50 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:03:35.569 20:50:50 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:03:35.569 20:50:50 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:03:35.569 20:50:50 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:03:35.569 20:50:50 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:03:35.569 20:50:50 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:03:35.569 20:50:51 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:03:35.569 20:50:51 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:03:35.569 20:50:51 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:03:35.569 20:50:51 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:03:35.569 20:50:51 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:03:35.569 20:50:51 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:03:35.569 20:50:51 -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:03:35.569 20:50:51 -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:03:35.569 20:50:51 -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:03:35.569 20:50:51 -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:03:35.569 20:50:51 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:35.569 20:50:51 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:35.569 20:50:51 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:35.569 20:50:51 -- paths/export.sh@5 -- # export PATH 00:03:35.569 20:50:51 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:35.569 20:50:51 -- nvmf/common.sh@47 -- # : 0 00:03:35.569 20:50:51 -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:03:35.569 20:50:51 -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:03:35.569 20:50:51 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:03:35.569 20:50:51 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:03:35.569 20:50:51 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:03:35.569 20:50:51 -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:03:35.569 20:50:51 -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:03:35.569 20:50:51 -- nvmf/common.sh@51 -- # have_pci_nics=0 00:03:35.569 20:50:51 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:03:35.569 20:50:51 -- 
spdk/autotest.sh@32 -- # uname -s 00:03:35.569 20:50:51 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:03:35.569 20:50:51 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:03:35.569 20:50:51 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:03:35.569 20:50:51 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:03:35.569 20:50:51 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:03:35.569 20:50:51 -- spdk/autotest.sh@44 -- # modprobe nbd 00:03:35.569 20:50:51 -- spdk/autotest.sh@46 -- # type -P udevadm 00:03:35.569 20:50:51 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:03:35.569 20:50:51 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:03:35.569 20:50:51 -- spdk/autotest.sh@48 -- # udevadm_pid=124314 00:03:35.569 20:50:51 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:03:35.569 20:50:51 -- pm/common@17 -- # local monitor 00:03:35.569 20:50:51 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:35.569 20:50:51 -- pm/common@23 -- # MONITOR_RESOURCES_PIDS["$monitor"]=124316 00:03:35.569 20:50:51 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:35.569 20:50:51 -- pm/common@23 -- # MONITOR_RESOURCES_PIDS["$monitor"]=124318 00:03:35.569 20:50:51 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:35.569 20:50:51 -- pm/common@21 -- # date +%s 00:03:35.569 20:50:51 -- pm/common@23 -- # MONITOR_RESOURCES_PIDS["$monitor"]=124321 00:03:35.569 20:50:51 -- pm/common@21 -- # date +%s 00:03:35.569 20:50:51 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:35.569 20:50:51 -- pm/common@23 -- # MONITOR_RESOURCES_PIDS["$monitor"]=124325 00:03:35.569 20:50:51 -- pm/common@21 -- # date +%s 00:03:35.569 20:50:51 -- pm/common@26 -- # sleep 1 00:03:35.569 20:50:51 -- pm/common@21 -- # date +%s 00:03:35.569 20:50:51 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1714071051 00:03:35.569 20:50:51 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1714071051 00:03:35.569 20:50:51 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1714071051 00:03:35.569 20:50:51 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1714071051 00:03:35.569 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1714071051_collect-vmstat.pm.log 00:03:35.569 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1714071051_collect-bmc-pm.bmc.pm.log 00:03:35.569 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1714071051_collect-cpu-load.pm.log 00:03:35.569 Redirecting to 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1714071051_collect-cpu-temp.pm.log 00:03:36.505 20:50:52 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:03:36.505 20:50:52 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:03:36.505 20:50:52 -- common/autotest_common.sh@710 -- # xtrace_disable 00:03:36.505 20:50:52 -- common/autotest_common.sh@10 -- # set +x 00:03:36.505 20:50:52 -- spdk/autotest.sh@59 -- # create_test_list 00:03:36.505 20:50:52 -- common/autotest_common.sh@734 -- # xtrace_disable 00:03:36.505 20:50:52 -- common/autotest_common.sh@10 -- # set +x 00:03:36.505 20:50:52 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh 00:03:36.505 20:50:52 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:36.505 20:50:52 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:36.505 20:50:52 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:03:36.505 20:50:52 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:36.505 20:50:52 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:03:36.505 20:50:52 -- common/autotest_common.sh@1441 -- # uname 00:03:36.505 20:50:52 -- common/autotest_common.sh@1441 -- # '[' Linux = FreeBSD ']' 00:03:36.505 20:50:52 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 00:03:36.505 20:50:52 -- common/autotest_common.sh@1461 -- # uname 00:03:36.505 20:50:52 -- common/autotest_common.sh@1461 -- # [[ Linux = FreeBSD ]] 00:03:36.505 20:50:52 -- spdk/autotest.sh@71 -- # grep CC_TYPE mk/cc.mk 00:03:36.505 20:50:52 -- spdk/autotest.sh@71 -- # CC_TYPE=CC_TYPE=clang 00:03:36.505 20:50:52 -- spdk/autotest.sh@72 -- # hash lcov 00:03:36.505 20:50:52 -- spdk/autotest.sh@72 -- # [[ CC_TYPE=clang == *\c\l\a\n\g* ]] 00:03:36.505 20:50:52 -- spdk/autotest.sh@89 -- # timing_enter pre_cleanup 00:03:36.505 20:50:52 -- common/autotest_common.sh@710 -- # xtrace_disable 00:03:36.505 20:50:52 -- common/autotest_common.sh@10 -- # set +x 00:03:36.505 20:50:52 -- spdk/autotest.sh@91 -- # rm -f 00:03:36.505 20:50:52 -- spdk/autotest.sh@94 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:39.794 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:03:39.794 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:03:39.794 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:03:40.053 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:03:40.053 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:03:40.053 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:03:40.053 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:03:40.053 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:03:40.053 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:03:40.053 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:03:40.053 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:03:40.053 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:03:40.053 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:03:40.312 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:03:40.312 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:03:40.312 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:03:40.312 0000:d8:00.0 (8086 0a54): Already 
using the nvme driver 00:03:40.312 20:50:55 -- spdk/autotest.sh@96 -- # get_zoned_devs 00:03:40.312 20:50:55 -- common/autotest_common.sh@1655 -- # zoned_devs=() 00:03:40.312 20:50:55 -- common/autotest_common.sh@1655 -- # local -gA zoned_devs 00:03:40.312 20:50:55 -- common/autotest_common.sh@1656 -- # local nvme bdf 00:03:40.312 20:50:55 -- common/autotest_common.sh@1658 -- # for nvme in /sys/block/nvme* 00:03:40.312 20:50:55 -- common/autotest_common.sh@1659 -- # is_block_zoned nvme0n1 00:03:40.312 20:50:55 -- common/autotest_common.sh@1648 -- # local device=nvme0n1 00:03:40.312 20:50:55 -- common/autotest_common.sh@1650 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:40.312 20:50:55 -- common/autotest_common.sh@1651 -- # [[ none != none ]] 00:03:40.312 20:50:55 -- spdk/autotest.sh@98 -- # (( 0 > 0 )) 00:03:40.312 20:50:55 -- spdk/autotest.sh@110 -- # for dev in /dev/nvme*n!(*p*) 00:03:40.312 20:50:55 -- spdk/autotest.sh@112 -- # [[ -z '' ]] 00:03:40.312 20:50:55 -- spdk/autotest.sh@113 -- # block_in_use /dev/nvme0n1 00:03:40.312 20:50:55 -- scripts/common.sh@378 -- # local block=/dev/nvme0n1 pt 00:03:40.312 20:50:55 -- scripts/common.sh@387 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:03:40.312 No valid GPT data, bailing 00:03:40.312 20:50:55 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:03:40.312 20:50:55 -- scripts/common.sh@391 -- # pt= 00:03:40.312 20:50:55 -- scripts/common.sh@392 -- # return 1 00:03:40.312 20:50:55 -- spdk/autotest.sh@114 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:03:40.312 1+0 records in 00:03:40.312 1+0 records out 00:03:40.312 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00657117 s, 160 MB/s 00:03:40.312 20:50:55 -- spdk/autotest.sh@118 -- # sync 00:03:40.312 20:50:55 -- spdk/autotest.sh@120 -- # xtrace_disable_per_cmd reap_spdk_processes 00:03:40.312 20:50:55 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:03:40.312 20:50:55 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:03:46.879 20:51:02 -- spdk/autotest.sh@124 -- # uname -s 00:03:46.879 20:51:02 -- spdk/autotest.sh@124 -- # '[' Linux = Linux ']' 00:03:46.879 20:51:02 -- spdk/autotest.sh@125 -- # run_test setup.sh /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:03:46.879 20:51:02 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:47.138 20:51:02 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:47.138 20:51:02 -- common/autotest_common.sh@10 -- # set +x 00:03:47.138 ************************************ 00:03:47.138 START TEST setup.sh 00:03:47.138 ************************************ 00:03:47.138 20:51:02 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:03:47.398 * Looking for test storage... 
00:03:47.398 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:47.398 20:51:02 -- setup/test-setup.sh@10 -- # uname -s 00:03:47.398 20:51:02 -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:03:47.398 20:51:02 -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:03:47.398 20:51:02 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:47.398 20:51:02 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:47.398 20:51:02 -- common/autotest_common.sh@10 -- # set +x 00:03:47.398 ************************************ 00:03:47.398 START TEST acl 00:03:47.398 ************************************ 00:03:47.398 20:51:02 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:03:47.660 * Looking for test storage... 00:03:47.660 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:47.660 20:51:03 -- setup/acl.sh@10 -- # get_zoned_devs 00:03:47.660 20:51:03 -- common/autotest_common.sh@1655 -- # zoned_devs=() 00:03:47.660 20:51:03 -- common/autotest_common.sh@1655 -- # local -gA zoned_devs 00:03:47.660 20:51:03 -- common/autotest_common.sh@1656 -- # local nvme bdf 00:03:47.660 20:51:03 -- common/autotest_common.sh@1658 -- # for nvme in /sys/block/nvme* 00:03:47.660 20:51:03 -- common/autotest_common.sh@1659 -- # is_block_zoned nvme0n1 00:03:47.660 20:51:03 -- common/autotest_common.sh@1648 -- # local device=nvme0n1 00:03:47.660 20:51:03 -- common/autotest_common.sh@1650 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:47.660 20:51:03 -- common/autotest_common.sh@1651 -- # [[ none != none ]] 00:03:47.660 20:51:03 -- setup/acl.sh@12 -- # devs=() 00:03:47.660 20:51:03 -- setup/acl.sh@12 -- # declare -a devs 00:03:47.660 20:51:03 -- setup/acl.sh@13 -- # drivers=() 00:03:47.660 20:51:03 -- setup/acl.sh@13 -- # declare -A drivers 00:03:47.660 20:51:03 -- setup/acl.sh@51 -- # setup reset 00:03:47.660 20:51:03 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:47.660 20:51:03 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:51.852 20:51:06 -- setup/acl.sh@52 -- # collect_setup_devs 00:03:51.852 20:51:06 -- setup/acl.sh@16 -- # local dev driver 00:03:51.852 20:51:06 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:51.852 20:51:06 -- setup/acl.sh@15 -- # setup output status 00:03:51.852 20:51:06 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:51.852 20:51:06 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:03:54.382 Hugepages 00:03:54.382 node hugesize free / total 00:03:54.382 20:51:09 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:54.382 20:51:09 -- setup/acl.sh@19 -- # continue 00:03:54.382 20:51:09 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:54.382 20:51:09 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:54.382 20:51:09 -- setup/acl.sh@19 -- # continue 00:03:54.382 20:51:09 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:54.382 20:51:09 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:54.382 20:51:09 -- setup/acl.sh@19 -- # continue 00:03:54.382 20:51:09 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:54.382 00:03:54.382 Type BDF Vendor Device NUMA Driver Device Block devices 00:03:54.382 20:51:09 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:54.382 20:51:09 -- setup/acl.sh@19 -- # continue 
00:03:54.382 20:51:09 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:54.382 20:51:10 -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:03:54.382 20:51:10 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:54.382 20:51:10 -- setup/acl.sh@20 -- # continue 00:03:54.382 20:51:10 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:54.382 20:51:10 -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:03:54.382 20:51:10 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:54.382 20:51:10 -- setup/acl.sh@20 -- # continue 00:03:54.382 20:51:10 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:54.382 20:51:10 -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:03:54.382 20:51:10 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:54.382 20:51:10 -- setup/acl.sh@20 -- # continue 00:03:54.382 20:51:10 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:54.382 20:51:10 -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:03:54.382 20:51:10 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:54.382 20:51:10 -- setup/acl.sh@20 -- # continue 00:03:54.382 20:51:10 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:54.382 20:51:10 -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:03:54.382 20:51:10 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:54.382 20:51:10 -- setup/acl.sh@20 -- # continue 00:03:54.382 20:51:10 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:54.382 20:51:10 -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:03:54.382 20:51:10 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:54.382 20:51:10 -- setup/acl.sh@20 -- # continue 00:03:54.382 20:51:10 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:54.382 20:51:10 -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:03:54.382 20:51:10 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:54.382 20:51:10 -- setup/acl.sh@20 -- # continue 00:03:54.382 20:51:10 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:54.642 20:51:10 -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:03:54.642 20:51:10 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:54.642 20:51:10 -- setup/acl.sh@20 -- # continue 00:03:54.642 20:51:10 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:54.642 20:51:10 -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:03:54.642 20:51:10 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:54.642 20:51:10 -- setup/acl.sh@20 -- # continue 00:03:54.642 20:51:10 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:54.642 20:51:10 -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:03:54.642 20:51:10 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:54.642 20:51:10 -- setup/acl.sh@20 -- # continue 00:03:54.642 20:51:10 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:54.642 20:51:10 -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:03:54.642 20:51:10 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:54.642 20:51:10 -- setup/acl.sh@20 -- # continue 00:03:54.642 20:51:10 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:54.642 20:51:10 -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:03:54.642 20:51:10 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:54.642 20:51:10 -- setup/acl.sh@20 -- # continue 00:03:54.642 20:51:10 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:54.642 20:51:10 -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:03:54.642 20:51:10 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:54.642 20:51:10 -- setup/acl.sh@20 -- # 
continue 00:03:54.642 20:51:10 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:54.642 20:51:10 -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:03:54.642 20:51:10 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:54.642 20:51:10 -- setup/acl.sh@20 -- # continue 00:03:54.642 20:51:10 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:54.642 20:51:10 -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:03:54.642 20:51:10 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:54.642 20:51:10 -- setup/acl.sh@20 -- # continue 00:03:54.642 20:51:10 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:54.642 20:51:10 -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:03:54.642 20:51:10 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:54.642 20:51:10 -- setup/acl.sh@20 -- # continue 00:03:54.642 20:51:10 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:54.642 20:51:10 -- setup/acl.sh@19 -- # [[ 0000:d8:00.0 == *:*:*.* ]] 00:03:54.642 20:51:10 -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:03:54.642 20:51:10 -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:03:54.642 20:51:10 -- setup/acl.sh@22 -- # devs+=("$dev") 00:03:54.642 20:51:10 -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:03:54.642 20:51:10 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:54.643 20:51:10 -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:03:54.643 20:51:10 -- setup/acl.sh@54 -- # run_test denied denied 00:03:54.643 20:51:10 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:54.643 20:51:10 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:54.643 20:51:10 -- common/autotest_common.sh@10 -- # set +x 00:03:54.902 ************************************ 00:03:54.902 START TEST denied 00:03:54.902 ************************************ 00:03:54.902 20:51:10 -- common/autotest_common.sh@1111 -- # denied 00:03:54.902 20:51:10 -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:d8:00.0' 00:03:54.902 20:51:10 -- setup/acl.sh@38 -- # setup output config 00:03:54.902 20:51:10 -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:d8:00.0' 00:03:54.902 20:51:10 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:54.902 20:51:10 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:03:58.194 0000:d8:00.0 (8086 0a54): Skipping denied controller at 0000:d8:00.0 00:03:58.194 20:51:13 -- setup/acl.sh@40 -- # verify 0000:d8:00.0 00:03:58.194 20:51:13 -- setup/acl.sh@28 -- # local dev driver 00:03:58.194 20:51:13 -- setup/acl.sh@30 -- # for dev in "$@" 00:03:58.194 20:51:13 -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:d8:00.0 ]] 00:03:58.194 20:51:13 -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:d8:00.0/driver 00:03:58.194 20:51:13 -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:03:58.194 20:51:13 -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:03:58.194 20:51:13 -- setup/acl.sh@41 -- # setup reset 00:03:58.194 20:51:13 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:58.194 20:51:13 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:02.387 00:04:02.387 real 0m7.645s 00:04:02.387 user 0m2.438s 00:04:02.387 sys 0m4.505s 00:04:02.387 20:51:18 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:04:02.387 20:51:18 -- common/autotest_common.sh@10 -- # set +x 00:04:02.387 ************************************ 00:04:02.387 END TEST denied 00:04:02.387 ************************************ 00:04:02.646 
20:51:18 -- setup/acl.sh@55 -- # run_test allowed allowed 00:04:02.646 20:51:18 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:02.646 20:51:18 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:02.646 20:51:18 -- common/autotest_common.sh@10 -- # set +x 00:04:02.646 ************************************ 00:04:02.646 START TEST allowed 00:04:02.646 ************************************ 00:04:02.646 20:51:18 -- common/autotest_common.sh@1111 -- # allowed 00:04:02.646 20:51:18 -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:d8:00.0 00:04:02.646 20:51:18 -- setup/acl.sh@45 -- # setup output config 00:04:02.646 20:51:18 -- setup/acl.sh@46 -- # grep -E '0000:d8:00.0 .*: nvme -> .*' 00:04:02.646 20:51:18 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:02.646 20:51:18 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:07.938 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:04:07.938 20:51:23 -- setup/acl.sh@47 -- # verify 00:04:07.938 20:51:23 -- setup/acl.sh@28 -- # local dev driver 00:04:07.938 20:51:23 -- setup/acl.sh@48 -- # setup reset 00:04:07.938 20:51:23 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:07.938 20:51:23 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:11.229 00:04:11.229 real 0m8.478s 00:04:11.229 user 0m2.354s 00:04:11.229 sys 0m4.630s 00:04:11.229 20:51:26 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:04:11.229 20:51:26 -- common/autotest_common.sh@10 -- # set +x 00:04:11.229 ************************************ 00:04:11.229 END TEST allowed 00:04:11.229 ************************************ 00:04:11.229 00:04:11.229 real 0m23.752s 00:04:11.229 user 0m7.616s 00:04:11.229 sys 0m14.183s 00:04:11.229 20:51:26 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:04:11.229 20:51:26 -- common/autotest_common.sh@10 -- # set +x 00:04:11.229 ************************************ 00:04:11.229 END TEST acl 00:04:11.229 ************************************ 00:04:11.229 20:51:26 -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:04:11.229 20:51:26 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:11.229 20:51:26 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:11.229 20:51:26 -- common/autotest_common.sh@10 -- # set +x 00:04:11.492 ************************************ 00:04:11.492 START TEST hugepages 00:04:11.492 ************************************ 00:04:11.492 20:51:26 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:04:11.492 * Looking for test storage... 
00:04:11.492 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:11.492 20:51:27 -- setup/hugepages.sh@10 -- # nodes_sys=() 00:04:11.492 20:51:27 -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:04:11.492 20:51:27 -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:04:11.493 20:51:27 -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:04:11.493 20:51:27 -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:04:11.493 20:51:27 -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:04:11.493 20:51:27 -- setup/common.sh@17 -- # local get=Hugepagesize 00:04:11.493 20:51:27 -- setup/common.sh@18 -- # local node= 00:04:11.493 20:51:27 -- setup/common.sh@19 -- # local var val 00:04:11.493 20:51:27 -- setup/common.sh@20 -- # local mem_f mem 00:04:11.493 20:51:27 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:11.493 20:51:27 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:11.493 20:51:27 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:11.493 20:51:27 -- setup/common.sh@28 -- # mapfile -t mem 00:04:11.493 20:51:27 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.493 20:51:27 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 39243408 kB' 'MemAvailable: 42851788 kB' 'Buffers: 10416 kB' 'Cached: 12755648 kB' 'SwapCached: 0 kB' 'Active: 9947904 kB' 'Inactive: 3333564 kB' 'Active(anon): 9282812 kB' 'Inactive(anon): 17936 kB' 'Active(file): 665092 kB' 'Inactive(file): 3315628 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 518696 kB' 'Mapped: 184080 kB' 'Shmem: 8785344 kB' 'KReclaimable: 249600 kB' 'Slab: 769092 kB' 'SReclaimable: 249600 kB' 'SUnreclaim: 519492 kB' 'KernelStack: 22144 kB' 'PageTables: 8392 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36439060 kB' 'Committed_AS: 10601008 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213544 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 488820 kB' 'DirectMap2M: 9682944 kB' 'DirectMap1G: 58720256 kB' 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # [[ Buffers == 
\H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.493 20:51:27 
-- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 
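The wall of `[[ <key> == \H\u\g\e\p\a\g\e\s\i\z\e ]]` / `continue` pairs running through this stretch is setup/common.sh's get_meminfo walking /proc/meminfo one field at a time under xtrace until it reaches the requested key. A condensed sketch of the same idiom (the traced helper additionally supports per-node meminfo files via `node=`, which this call does not use):

# Sketch of the get_meminfo loop traced above: skip every /proc/meminfo
# key until the requested one, then print its value and stop.
get_meminfo() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue    # e.g. Hugepagesize
        echo "$val"                         # kB for sized fields
        return 0
    done < /proc/meminfo
    return 1
}
get_meminfo Hugepagesize    # prints 2048 on this machine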
00:04:11.493 20:51:27 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.493 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.493 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.494 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.494 20:51:27 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.494 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.494 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.494 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.494 20:51:27 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.494 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.494 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.494 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.494 20:51:27 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.494 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.494 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.494 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.494 20:51:27 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.494 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.494 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.494 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.494 20:51:27 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.494 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.494 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.494 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.494 20:51:27 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.494 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.494 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.494 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.494 20:51:27 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.494 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.494 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.494 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.494 20:51:27 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.494 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.494 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.494 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.494 20:51:27 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.494 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.494 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 
00:04:11.494 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.494 20:51:27 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.494 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.494 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.494 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.494 20:51:27 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.494 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.494 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.494 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.494 20:51:27 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.494 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.494 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.494 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.494 20:51:27 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.494 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.494 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.494 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.494 20:51:27 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.494 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.494 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.494 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.494 20:51:27 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.494 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.494 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.494 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.494 20:51:27 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.494 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.494 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.494 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.494 20:51:27 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.494 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.494 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.494 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.494 20:51:27 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.494 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.494 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.494 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.494 20:51:27 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.494 20:51:27 -- setup/common.sh@32 -- # continue 00:04:11.494 20:51:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.494 20:51:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.494 20:51:27 -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:11.494 20:51:27 -- setup/common.sh@33 -- # echo 2048 00:04:11.494 20:51:27 -- setup/common.sh@33 -- # return 0 00:04:11.494 20:51:27 -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:04:11.494 20:51:27 -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:04:11.494 20:51:27 -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:04:11.494 20:51:27 -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:04:11.494 20:51:27 -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:04:11.494 20:51:27 -- 
setup/hugepages.sh@23 -- # unset -v HUGENODE 00:04:11.494 20:51:27 -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:04:11.494 20:51:27 -- setup/hugepages.sh@207 -- # get_nodes 00:04:11.494 20:51:27 -- setup/hugepages.sh@27 -- # local node 00:04:11.494 20:51:27 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:11.494 20:51:27 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=2048 00:04:11.494 20:51:27 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:11.494 20:51:27 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:11.494 20:51:27 -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:11.494 20:51:27 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:11.494 20:51:27 -- setup/hugepages.sh@208 -- # clear_hp 00:04:11.494 20:51:27 -- setup/hugepages.sh@37 -- # local node hp 00:04:11.494 20:51:27 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:11.494 20:51:27 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:11.494 20:51:27 -- setup/hugepages.sh@41 -- # echo 0 00:04:11.494 20:51:27 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:11.494 20:51:27 -- setup/hugepages.sh@41 -- # echo 0 00:04:11.494 20:51:27 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:11.494 20:51:27 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:11.494 20:51:27 -- setup/hugepages.sh@41 -- # echo 0 00:04:11.494 20:51:27 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:11.494 20:51:27 -- setup/hugepages.sh@41 -- # echo 0 00:04:11.494 20:51:27 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:04:11.494 20:51:27 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:04:11.494 20:51:27 -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:04:11.494 20:51:27 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:11.494 20:51:27 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:11.494 20:51:27 -- common/autotest_common.sh@10 -- # set +x 00:04:11.819 ************************************ 00:04:11.819 START TEST default_setup 00:04:11.819 ************************************ 00:04:11.819 20:51:27 -- common/autotest_common.sh@1111 -- # default_setup 00:04:11.819 20:51:27 -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:04:11.819 20:51:27 -- setup/hugepages.sh@49 -- # local size=2097152 00:04:11.819 20:51:27 -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:04:11.819 20:51:27 -- setup/hugepages.sh@51 -- # shift 00:04:11.819 20:51:27 -- setup/hugepages.sh@52 -- # node_ids=('0') 00:04:11.819 20:51:27 -- setup/hugepages.sh@52 -- # local node_ids 00:04:11.819 20:51:27 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:11.819 20:51:27 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:11.819 20:51:27 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:04:11.819 20:51:27 -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:04:11.819 20:51:27 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:11.819 20:51:27 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:11.819 20:51:27 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:11.819 20:51:27 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:11.819 20:51:27 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:11.819 20:51:27 -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 
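With Hugepagesize read back as 2048 kB, clear_hp zeroes both NUMA nodes' hugepage pools before default_setup computes its own request: 2097152 kB at 2048 kB per page works out to the 1024 pages it pins to node 0 in the trace. A sketch of the reset step, using the sysfs paths shown above (writing them requires root):

# Sketch of clear_hp: write 0 into every per-node hugepage pool so the
# test starts from a clean slate.
for node in /sys/devices/system/node/node[0-9]*; do
    for hp in "$node"/hugepages/hugepages-*; do
        echo 0 > "$hp/nr_hugepages"
    done
done
# default_setup's request: 2097152 kB / 2048 kB per page = 1024 pages.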
00:04:11.819 20:51:27 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:11.819 20:51:27 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:04:11.819 20:51:27 -- setup/hugepages.sh@73 -- # return 0 00:04:11.819 20:51:27 -- setup/hugepages.sh@137 -- # setup output 00:04:11.819 20:51:27 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:11.819 20:51:27 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:15.156 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:15.156 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:15.156 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:15.156 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:15.156 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:15.156 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:15.156 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:15.156 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:15.156 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:15.156 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:15.156 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:15.156 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:15.156 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:15.156 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:15.156 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:15.156 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:16.539 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:04:16.539 20:51:32 -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:04:16.539 20:51:32 -- setup/hugepages.sh@89 -- # local node 00:04:16.539 20:51:32 -- setup/hugepages.sh@90 -- # local sorted_t 00:04:16.539 20:51:32 -- setup/hugepages.sh@91 -- # local sorted_s 00:04:16.539 20:51:32 -- setup/hugepages.sh@92 -- # local surp 00:04:16.539 20:51:32 -- setup/hugepages.sh@93 -- # local resv 00:04:16.539 20:51:32 -- setup/hugepages.sh@94 -- # local anon 00:04:16.539 20:51:32 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:16.540 20:51:32 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:16.540 20:51:32 -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:16.540 20:51:32 -- setup/common.sh@18 -- # local node= 00:04:16.540 20:51:32 -- setup/common.sh@19 -- # local var val 00:04:16.540 20:51:32 -- setup/common.sh@20 -- # local mem_f mem 00:04:16.540 20:51:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:16.540 20:51:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:16.540 20:51:32 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:16.540 20:51:32 -- setup/common.sh@28 -- # mapfile -t mem 00:04:16.540 20:51:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.540 20:51:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41442336 kB' 'MemAvailable: 45050724 kB' 'Buffers: 10416 kB' 'Cached: 12755768 kB' 'SwapCached: 0 kB' 'Active: 9963296 kB' 'Inactive: 3333564 kB' 'Active(anon): 9298204 kB' 'Inactive(anon): 17936 kB' 'Active(file): 665092 kB' 'Inactive(file): 3315628 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 534024 kB' 'Mapped: 184308 kB' 'Shmem: 8785464 kB' 'KReclaimable: 249616 kB' 'Slab: 766504 kB' 'SReclaimable: 249616 kB' 'SUnreclaim: 516888 kB' 'KernelStack: 
22288 kB' 'PageTables: 8856 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 10614732 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213672 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 488820 kB' 'DirectMap2M: 9682944 kB' 'DirectMap1G: 58720256 kB' 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.540 
20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # [[ 
KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.540 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.540 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.541 20:51:32 -- 
setup/common.sh@31 -- # read -r var val _ 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.541 20:51:32 -- setup/common.sh@33 -- # echo 0 00:04:16.541 20:51:32 -- setup/common.sh@33 -- # return 0 00:04:16.541 20:51:32 -- setup/hugepages.sh@97 -- # anon=0 00:04:16.541 20:51:32 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:16.541 20:51:32 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:16.541 20:51:32 -- setup/common.sh@18 -- # local node= 00:04:16.541 20:51:32 -- setup/common.sh@19 -- # local var val 00:04:16.541 20:51:32 -- setup/common.sh@20 -- # local mem_f mem 00:04:16.541 20:51:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:16.541 20:51:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:16.541 20:51:32 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:16.541 20:51:32 -- setup/common.sh@28 -- # mapfile -t mem 00:04:16.541 20:51:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.541 20:51:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41442632 kB' 'MemAvailable: 45051020 kB' 'Buffers: 10416 kB' 'Cached: 12755772 kB' 'SwapCached: 0 kB' 'Active: 9963780 kB' 'Inactive: 3333564 kB' 'Active(anon): 9298688 kB' 'Inactive(anon): 17936 kB' 'Active(file): 665092 kB' 'Inactive(file): 3315628 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 534556 kB' 'Mapped: 184296 kB' 'Shmem: 8785468 kB' 'KReclaimable: 249616 kB' 'Slab: 766424 kB' 'SReclaimable: 249616 kB' 'SUnreclaim: 516808 kB' 'KernelStack: 22288 kB' 'PageTables: 9020 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 10616272 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213576 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 488820 
kB' 'DirectMap2M: 9682944 kB' 'DirectMap1G: 58720256 kB' 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.541 20:51:32 -- 
setup/common.sh@32 -- # continue 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.541 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.541 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # [[ 
SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 
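verify_nr_hugepages repeats the same meminfo scan three times — AnonHugePages above, HugePages_Surp here, and HugePages_Rsvd next — and each must come back 0 against the snapshot's HugePages_Total/HugePages_Free of 1024. A sketch of that pass, reusing the get_meminfo helper sketched earlier (the combined zero-check is an inference from the traced anon=0/surp=0 assignments, not the script's literal code):

# Sketch of the verification pass: a clean setup has no anonymous
# transparent hugepages, no surplus pages, and no reserved pages.
anon=$(get_meminfo AnonHugePages)     # expect 0 kB
surp=$(get_meminfo HugePages_Surp)    # expect 0
resv=$(get_meminfo HugePages_Rsvd)    # expect 0
(( anon == 0 && surp == 0 && resv == 0 )) && echo "hugepage state clean"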
00:04:16.542 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.542 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.542 20:51:32 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.543 20:51:32 -- setup/common.sh@33 -- # echo 0 00:04:16.543 20:51:32 -- setup/common.sh@33 -- # return 0 00:04:16.543 20:51:32 -- setup/hugepages.sh@99 -- # surp=0 00:04:16.543 20:51:32 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:16.543 20:51:32 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:16.543 20:51:32 -- setup/common.sh@18 -- # local node= 00:04:16.543 20:51:32 -- setup/common.sh@19 -- # local var val 00:04:16.543 20:51:32 -- setup/common.sh@20 -- # local mem_f mem 00:04:16.543 20:51:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:16.543 20:51:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:16.543 20:51:32 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:16.543 20:51:32 -- setup/common.sh@28 -- # mapfile -t mem 00:04:16.543 20:51:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.543 20:51:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41441656 kB' 'MemAvailable: 45050044 kB' 'Buffers: 10416 kB' 'Cached: 12755796 kB' 'SwapCached: 0 kB' 'Active: 9963592 kB' 'Inactive: 3333564 kB' 'Active(anon): 9298500 kB' 'Inactive(anon): 17936 kB' 'Active(file): 665092 kB' 'Inactive(file): 3315628 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 534292 kB' 'Mapped: 184360 kB' 'Shmem: 8785492 kB' 'KReclaimable: 249616 kB' 'Slab: 766492 kB' 'SReclaimable: 249616 kB' 'SUnreclaim: 516876 kB' 'KernelStack: 22160 kB' 'PageTables: 8780 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 10616500 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213592 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 488820 kB' 'DirectMap2M: 9682944 kB' 'DirectMap1G: 58720256 kB' 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # 
[[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.543 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.543 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.544 20:51:32 
-- setup/common.sh@31 -- # read -r var val _ 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # continue 
00:04:16.544 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.544 20:51:32 -- 
setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.544 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.544 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.545 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.545 20:51:32 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.545 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.545 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.545 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.545 20:51:32 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.545 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.545 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.545 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.545 20:51:32 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:16.545 20:51:32 -- setup/common.sh@33 -- # echo 0 00:04:16.545 20:51:32 -- setup/common.sh@33 -- # return 0 00:04:16.545 20:51:32 -- setup/hugepages.sh@100 -- # resv=0 00:04:16.545 20:51:32 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:16.545 nr_hugepages=1024 00:04:16.545 20:51:32 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:16.545 resv_hugepages=0 00:04:16.545 20:51:32 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:16.545 surplus_hugepages=0 00:04:16.545 20:51:32 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:16.545 anon_hugepages=0 00:04:16.545 20:51:32 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:16.545 20:51:32 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:16.545 20:51:32 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:16.545 20:51:32 -- setup/common.sh@17 -- # local 
get=HugePages_Total 00:04:16.545 20:51:32 -- setup/common.sh@18 -- # local node= 00:04:16.545 20:51:32 -- setup/common.sh@19 -- # local var val 00:04:16.545 20:51:32 -- setup/common.sh@20 -- # local mem_f mem 00:04:16.545 20:51:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:16.545 20:51:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:16.545 20:51:32 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:16.545 20:51:32 -- setup/common.sh@28 -- # mapfile -t mem 00:04:16.545 20:51:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:16.545 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.545 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.545 20:51:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41450348 kB' 'MemAvailable: 45058736 kB' 'Buffers: 10416 kB' 'Cached: 12755808 kB' 'SwapCached: 0 kB' 'Active: 9963832 kB' 'Inactive: 3333564 kB' 'Active(anon): 9298740 kB' 'Inactive(anon): 17936 kB' 'Active(file): 665092 kB' 'Inactive(file): 3315628 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 534604 kB' 'Mapped: 184368 kB' 'Shmem: 8785504 kB' 'KReclaimable: 249616 kB' 'Slab: 766556 kB' 'SReclaimable: 249616 kB' 'SUnreclaim: 516940 kB' 'KernelStack: 22160 kB' 'PageTables: 8336 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 10616672 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213624 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 488820 kB' 'DirectMap2M: 9682944 kB' 'DirectMap1G: 58720256 kB' 00:04:16.545 20:51:32 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.545 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.545 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.545 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.545 20:51:32 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.545 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.545 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.545 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.545 20:51:32 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.545 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.545 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.545 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.545 20:51:32 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.545 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.545 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.545 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.545 20:51:32 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.545 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.545 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.545 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.545 20:51:32 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 
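
The field-by-field scan traced above is common.sh's get_meminfo walking /proc/meminfo with IFS=': ' until the requested key matches; every non-matching field hits the `continue` at common.sh@32. A minimal standalone sketch of that pattern — the helper name get_meminfo_value is ours for illustration, not SPDK's:

    get_meminfo_value() {
        # Same IFS=': ' / read -r var val _ scan as common.sh@31-33:
        # print the value of the first field whose name matches $1.
        local get=$1 var val _
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] && { printf '%s\n' "$val"; return 0; }
        done < /proc/meminfo
        return 1
    }
    # Usage: get_meminfo_value HugePages_Total   -> 1024 on this host
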
00:04:16.545 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.545 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.545 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.545 20:51:32 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.545 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.545 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.545 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.545 20:51:32 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.545 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.545 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.545 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.545 20:51:32 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.545 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.545 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.545 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.545 20:51:32 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.545 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.545 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.545 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.545 20:51:32 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.545 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.545 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.545 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.545 20:51:32 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.545 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.545 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.546 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.546 20:51:32 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.546 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.546 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.546 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.546 20:51:32 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.546 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.546 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.546 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.546 20:51:32 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.546 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.546 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.546 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.546 20:51:32 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.546 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.546 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.546 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.546 20:51:32 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.546 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.546 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.546 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.546 20:51:32 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.546 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.546 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.546 20:51:32 -- setup/common.sh@31 -- # 
read -r var val _ 00:04:16.546 20:51:32 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.546 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.546 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.546 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.546 20:51:32 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.546 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.546 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.546 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.546 20:51:32 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.546 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.546 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.546 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.546 20:51:32 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.546 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.546 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.546 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.546 20:51:32 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.546 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.546 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.546 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.546 20:51:32 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.546 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.546 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.546 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.546 20:51:32 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.546 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.546 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.546 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.546 20:51:32 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.546 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.546 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.546 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.546 20:51:32 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.546 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # 
continue 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 
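
The hugepages.sh@107/@110 checks traced above reduce to one invariant: the kernel's HugePages_Total must equal the requested nr_hugepages plus the surplus and reserved counts read earlier (both 0 in this run). A sketch of that consistency check, reusing the hypothetical helper from above:

    # Invariant behind hugepages.sh@107/@110, with this run's values:
    nr_hugepages=1024 surp=0 resv=0
    total=$(get_meminfo_value HugePages_Total)
    (( total == nr_hugepages + surp + resv )) || echo 'hugepage count mismatch' >&2
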
00:04:16.807 20:51:32 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:16.807 20:51:32 -- setup/common.sh@33 -- # echo 1024 00:04:16.807 20:51:32 -- setup/common.sh@33 -- # return 0 00:04:16.807 20:51:32 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:16.807 20:51:32 -- setup/hugepages.sh@112 -- # get_nodes 00:04:16.807 20:51:32 -- setup/hugepages.sh@27 -- # local node 00:04:16.807 20:51:32 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:16.807 20:51:32 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:16.807 20:51:32 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:16.807 20:51:32 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:16.807 20:51:32 -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:16.807 20:51:32 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:16.807 20:51:32 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:16.807 20:51:32 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:16.807 20:51:32 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:16.807 20:51:32 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:16.807 20:51:32 -- setup/common.sh@18 -- # local node=0 00:04:16.807 20:51:32 -- setup/common.sh@19 -- # local var val 00:04:16.807 20:51:32 -- setup/common.sh@20 -- # local mem_f mem 00:04:16.807 20:51:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:16.807 20:51:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:16.807 20:51:32 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:16.807 20:51:32 -- setup/common.sh@28 -- # mapfile -t mem 00:04:16.807 20:51:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.807 20:51:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32639140 kB' 'MemFree: 23868328 kB' 'MemUsed: 8770812 kB' 'SwapCached: 0 
kB' 'Active: 4602132 kB' 'Inactive: 102364 kB' 'Active(anon): 4236636 kB' 'Inactive(anon): 0 kB' 'Active(file): 365496 kB' 'Inactive(file): 102364 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4405268 kB' 'Mapped: 135048 kB' 'AnonPages: 302464 kB' 'Shmem: 3937408 kB' 'KernelStack: 12696 kB' 'PageTables: 5868 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 124064 kB' 'Slab: 400996 kB' 'SReclaimable: 124064 kB' 'SUnreclaim: 276932 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.807 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.807 20:51:32 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.808 
20:51:32 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # IFS=': 
' 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # [[ HugePages_Free == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # continue 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.808 20:51:32 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.808 20:51:32 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:16.808 20:51:32 -- setup/common.sh@33 -- # echo 0 00:04:16.808 20:51:32 -- setup/common.sh@33 -- # return 0 00:04:16.808 20:51:32 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:16.808 20:51:32 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:16.808 20:51:32 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:16.808 20:51:32 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:16.808 20:51:32 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:16.808 node0=1024 expecting 1024 00:04:16.808 20:51:32 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:16.808 00:04:16.808 real 0m4.961s 00:04:16.808 user 0m1.242s 00:04:16.808 sys 0m2.187s 00:04:16.808 20:51:32 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:04:16.808 20:51:32 -- common/autotest_common.sh@10 -- # set +x 00:04:16.808 ************************************ 00:04:16.808 END TEST default_setup 00:04:16.808 ************************************ 00:04:16.808 20:51:32 -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc 00:04:16.808 20:51:32 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:16.808 20:51:32 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:16.808 20:51:32 -- common/autotest_common.sh@10 -- # set +x 00:04:16.808 ************************************ 00:04:16.808 START TEST per_node_1G_alloc 00:04:16.808 ************************************ 00:04:16.808 20:51:32 -- common/autotest_common.sh@1111 -- # per_node_1G_alloc 00:04:16.808 20:51:32 -- setup/hugepages.sh@143 -- # local IFS=, 00:04:16.808 20:51:32 -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1 00:04:16.808 20:51:32 -- setup/hugepages.sh@49 -- # local size=1048576 00:04:16.808 20:51:32 -- setup/hugepages.sh@50 -- # (( 3 > 1 )) 00:04:16.808 20:51:32 -- setup/hugepages.sh@51 -- # shift 00:04:16.808 20:51:32 -- setup/hugepages.sh@52 -- # node_ids=('0' '1') 00:04:16.808 20:51:32 -- setup/hugepages.sh@52 -- # local node_ids 00:04:16.808 20:51:32 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:16.808 20:51:32 -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:04:16.808 20:51:32 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1 00:04:16.808 20:51:32 -- setup/hugepages.sh@62 -- # user_nodes=('0' '1') 00:04:16.809 20:51:32 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:16.809 20:51:32 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:04:16.809 20:51:32 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:16.809 20:51:32 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:16.809 20:51:32 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:16.809 20:51:32 -- setup/hugepages.sh@69 -- # (( 2 > 0 )) 00:04:16.809 20:51:32 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:16.809 20:51:32 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:04:16.809 20:51:32 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:16.809 20:51:32 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:04:16.809 20:51:32 -- setup/hugepages.sh@73 -- # return 0 00:04:16.809 20:51:32 -- setup/hugepages.sh@146 -- # NRHUGE=512 00:04:16.809 
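
per_node_1G_alloc requests 1048576 kB on each of nodes 0 and 1; with the 'Hugepagesize: 2048 kB' reported in the meminfo dumps above, that works out to the 512 pages per node written into nodes_test and exported as NRHUGE=512. The arithmetic, as a sketch consistent with the get_test_nr_hugepages trace at hugepages.sh@49-71:

    # Size-to-pages conversion implied by the trace:
    size_kb=1048576          # 1 GiB requested per node
    hugepagesize_kb=2048     # from 'Hugepagesize: 2048 kB' above
    nr=$(( size_kb / hugepagesize_kb ))   # -> 512, i.e. NRHUGE=512 per node
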
20:51:32 -- setup/hugepages.sh@146 -- # HUGENODE=0,1 00:04:16.809 20:51:32 -- setup/hugepages.sh@146 -- # setup output 00:04:16.809 20:51:32 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:16.809 20:51:32 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:20.101 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:20.101 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:20.101 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:20.101 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:20.101 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:20.101 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:20.101 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:20.101 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:20.101 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:20.101 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:20.101 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:20.101 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:20.101 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:20.101 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:20.101 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:20.101 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:20.101 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:20.101 20:51:35 -- setup/hugepages.sh@147 -- # nr_hugepages=1024 00:04:20.101 20:51:35 -- setup/hugepages.sh@147 -- # verify_nr_hugepages 00:04:20.101 20:51:35 -- setup/hugepages.sh@89 -- # local node 00:04:20.101 20:51:35 -- setup/hugepages.sh@90 -- # local sorted_t 00:04:20.101 20:51:35 -- setup/hugepages.sh@91 -- # local sorted_s 00:04:20.101 20:51:35 -- setup/hugepages.sh@92 -- # local surp 00:04:20.101 20:51:35 -- setup/hugepages.sh@93 -- # local resv 00:04:20.101 20:51:35 -- setup/hugepages.sh@94 -- # local anon 00:04:20.102 20:51:35 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:20.102 20:51:35 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:20.102 20:51:35 -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:20.102 20:51:35 -- setup/common.sh@18 -- # local node= 00:04:20.102 20:51:35 -- setup/common.sh@19 -- # local var val 00:04:20.102 20:51:35 -- setup/common.sh@20 -- # local mem_f mem 00:04:20.102 20:51:35 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:20.102 20:51:35 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:20.102 20:51:35 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:20.102 20:51:35 -- setup/common.sh@28 -- # mapfile -t mem 00:04:20.102 20:51:35 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.102 20:51:35 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41447000 kB' 'MemAvailable: 45055388 kB' 'Buffers: 10416 kB' 'Cached: 12755900 kB' 'SwapCached: 0 kB' 'Active: 9962612 kB' 'Inactive: 3333564 kB' 'Active(anon): 9297520 kB' 'Inactive(anon): 17936 kB' 'Active(file): 665092 kB' 'Inactive(file): 3315628 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 532980 kB' 'Mapped: 
183368 kB' 'Shmem: 8785596 kB' 'KReclaimable: 249616 kB' 'Slab: 766880 kB' 'SReclaimable: 249616 kB' 'SUnreclaim: 517264 kB' 'KernelStack: 22048 kB' 'PageTables: 8172 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 10607228 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213736 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 488820 kB' 'DirectMap2M: 9682944 kB' 'DirectMap1G: 58720256 kB' 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.102 20:51:35 -- 
setup/common.sh@31 -- # read -r var val _ 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.102 20:51:35 -- 
setup/common.sh@31 -- # IFS=': ' 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.102 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.102 20:51:35 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.103 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.103 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.103 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.103 20:51:35 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.103 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.103 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.103 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.103 20:51:35 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
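
The hugepages.sh@96 guard traced earlier compares the kernel's transparent-hugepage mode string ('always [madvise] never' on this host) against *[never]*; only when THP is not fully disabled does verify_nr_hugepages go on to read AnonHugePages, as in the scan in progress here. A sketch of that guard, under the assumption that the standard kernel knob path is the one consulted:

    # THP guard as at hugepages.sh@96-97 (anon came back 0 kB in this run):
    thp=$(</sys/kernel/mm/transparent_hugepage/enabled)
    if [[ $thp != *"[never]"* ]]; then
        anon=$(get_meminfo_value AnonHugePages)   # helper sketched earlier
    else
        anon=0
    fi
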
00:04:20.103 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.103 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.103 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.103 20:51:35 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.103 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.103 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.103 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.103 20:51:35 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.103 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.103 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.103 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.103 20:51:35 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.103 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.103 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.103 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.103 20:51:35 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.103 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.103 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.103 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.103 20:51:35 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:20.103 20:51:35 -- setup/common.sh@33 -- # echo 0 00:04:20.103 20:51:35 -- setup/common.sh@33 -- # return 0 00:04:20.103 20:51:35 -- setup/hugepages.sh@97 -- # anon=0 00:04:20.103 20:51:35 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:20.103 20:51:35 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:20.103 20:51:35 -- setup/common.sh@18 -- # local node= 00:04:20.103 20:51:35 -- setup/common.sh@19 -- # local var val 00:04:20.103 20:51:35 -- setup/common.sh@20 -- # local mem_f mem 00:04:20.103 20:51:35 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:20.103 20:51:35 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:20.103 20:51:35 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:20.103 20:51:35 -- setup/common.sh@28 -- # mapfile -t mem 00:04:20.103 20:51:35 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:20.103 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.103 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.103 20:51:35 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41446988 kB' 'MemAvailable: 45055376 kB' 'Buffers: 10416 kB' 'Cached: 12755904 kB' 'SwapCached: 0 kB' 'Active: 9962292 kB' 'Inactive: 3333564 kB' 'Active(anon): 9297200 kB' 'Inactive(anon): 17936 kB' 'Active(file): 665092 kB' 'Inactive(file): 3315628 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 532712 kB' 'Mapped: 183348 kB' 'Shmem: 8785600 kB' 'KReclaimable: 249616 kB' 'Slab: 766888 kB' 'SReclaimable: 249616 kB' 'SUnreclaim: 517272 kB' 'KernelStack: 22048 kB' 'PageTables: 8180 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 10607240 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213704 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 
00:04:20.103 20:51:35 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:20.103 20:51:35 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:20.103 20:51:35 -- setup/common.sh@18 -- # local node=
00:04:20.103 20:51:35 -- setup/common.sh@19 -- # local var val
00:04:20.103 20:51:35 -- setup/common.sh@20 -- # local mem_f mem
00:04:20.103 20:51:35 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:20.103 20:51:35 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:20.103 20:51:35 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:20.103 20:51:35 -- setup/common.sh@28 -- # mapfile -t mem
00:04:20.103 20:51:35 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:20.103 20:51:35 -- setup/common.sh@31 -- # IFS=': '
00:04:20.103 20:51:35 -- setup/common.sh@31 -- # read -r var val _
00:04:20.103 20:51:35 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41446988 kB' 'MemAvailable: 45055376 kB' 'Buffers: 10416 kB' 'Cached: 12755904 kB' 'SwapCached: 0 kB' 'Active: 9962292 kB' 'Inactive: 3333564 kB' 'Active(anon): 9297200 kB' 'Inactive(anon): 17936 kB' 'Active(file): 665092 kB' 'Inactive(file): 3315628 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 532712 kB' 'Mapped: 183348 kB' 'Shmem: 8785600 kB' 'KReclaimable: 249616 kB' 'Slab: 766888 kB' 'SReclaimable: 249616 kB' 'SUnreclaim: 517272 kB' 'KernelStack: 22048 kB' 'PageTables: 8180 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 10607240 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213704 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 488820 kB' 'DirectMap2M: 9682944 kB' 'DirectMap1G: 58720256 kB'
00:04:20.103 20:51:35 -- setup/common.sh@32 -- # ... (scan: every key from MemTotal through HugePages_Rsvd fails the HugePages_Surp match and is skipped) ...
00:04:20.104 20:51:35 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:20.104 20:51:35 -- setup/common.sh@33 -- # echo 0
00:04:20.104 20:51:35 -- setup/common.sh@33 -- # return 0
00:04:20.104 20:51:35 -- setup/hugepages.sh@99 -- # surp=0
00:04:20.104 20:51:35 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:20.104 20:51:35 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:20.104 20:51:35 -- setup/common.sh@18 -- # local node=
00:04:20.104 20:51:35 -- setup/common.sh@19 -- # local var val
00:04:20.104 20:51:35 -- setup/common.sh@20 -- # local mem_f mem
00:04:20.104 20:51:35 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:20.104 20:51:35 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:20.104 20:51:35 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:20.104 20:51:35 -- setup/common.sh@28 -- # mapfile -t mem
00:04:20.104 20:51:35 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:20.104 20:51:35 -- setup/common.sh@31 -- # IFS=': '
00:04:20.104 20:51:35 -- setup/common.sh@31 -- # read -r var val _
00:04:20.104 20:51:35 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41447000 kB' 'MemAvailable: 45055388 kB' 'Buffers: 10416 kB' 'Cached: 12755916 kB' 'SwapCached: 0 kB' 'Active: 9962336 kB' 'Inactive: 3333564 kB' 'Active(anon): 9297244 kB' 'Inactive(anon): 17936 kB' 'Active(file): 665092 kB' 'Inactive(file): 3315628 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 532724 kB' 'Mapped: 183348 kB' 'Shmem: 8785612 kB' 'KReclaimable: 249616 kB' 'Slab: 766888 kB' 'SReclaimable: 249616 kB' 'SUnreclaim: 517272 kB' 'KernelStack: 22048 kB' 'PageTables: 8180 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 10607252 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213704 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 488820 kB' 'DirectMap2M: 9682944 kB' 'DirectMap1G: 58720256 kB'
00:04:20.367 20:51:35 -- setup/common.sh@32 -- # ... (scan: every key from MemTotal through HugePages_Free fails the HugePages_Rsvd match and is skipped) ...
00:04:20.368 20:51:35 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:20.368 20:51:35 -- setup/common.sh@33 -- # echo 0
00:04:20.368 20:51:35 -- setup/common.sh@33 -- # return 0
00:04:20.368 20:51:35 -- setup/hugepages.sh@100 -- # resv=0
00:04:20.368 20:51:35 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
nr_hugepages=1024
00:04:20.368 20:51:35 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
resv_hugepages=0
00:04:20.368 20:51:35 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
surplus_hugepages=0
00:04:20.368 20:51:35 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
anon_hugepages=0
00:04:20.368 20:51:35 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:20.368 20:51:35 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
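
The three lookups above feed the consistency checks at hugepages.sh@107 and @109: the kernel's hugepage counters must add up against the count the test configured. A standalone sketch of that arithmetic, reusing the hypothetical get_meminfo_value helper from the earlier sketch; the expected count of 1024 mirrors this run:

    # Accounting check in the spirit of hugepages.sh@107/@110 (sketch,
    # not the script's exact expression). All names besides the meminfo
    # keys are assumptions.
    expected=1024                                # requested earlier in the test
    total=$(get_meminfo_value HugePages_Total)   # 1024 in the snapshots above
    surp=$(get_meminfo_value HugePages_Surp)     # 0
    resv=$(get_meminfo_value HugePages_Rsvd)     # 0
    if (( total == expected + surp + resv )); then
        echo "hugepage accounting consistent"
    else
        echo "mismatch: total=$total surp=$surp resv=$resv" >&2
    fi
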
00:04:20.368 20:51:35 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:20.368 20:51:35 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:20.368 20:51:35 -- setup/common.sh@18 -- # local node=
00:04:20.368 20:51:35 -- setup/common.sh@19 -- # local var val
00:04:20.368 20:51:35 -- setup/common.sh@20 -- # local mem_f mem
00:04:20.369 20:51:35 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:20.369 20:51:35 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:20.369 20:51:35 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:20.369 20:51:35 -- setup/common.sh@28 -- # mapfile -t mem
00:04:20.369 20:51:35 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:20.369 20:51:35 -- setup/common.sh@31 -- # IFS=': '
00:04:20.369 20:51:35 -- setup/common.sh@31 -- # read -r var val _
00:04:20.369 20:51:35 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41446816 kB' 'MemAvailable: 45055204 kB' 'Buffers: 10416 kB' 'Cached: 12755932 kB' 'SwapCached: 0 kB' 'Active: 9962320 kB' 'Inactive: 3333564 kB' 'Active(anon): 9297228 kB' 'Inactive(anon): 17936 kB' 'Active(file): 665092 kB' 'Inactive(file): 3315628 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 532720 kB' 'Mapped: 183348 kB' 'Shmem: 8785628 kB' 'KReclaimable: 249616 kB' 'Slab: 766888 kB' 'SReclaimable: 249616 kB' 'SUnreclaim: 517272 kB' 'KernelStack: 22048 kB' 'PageTables: 8180 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 10607268 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213704 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 488820 kB' 'DirectMap2M: 9682944 kB' 'DirectMap1G: 58720256 kB'
00:04:20.369 20:51:35 -- setup/common.sh@32 -- # ... (scan: every key from MemTotal through Unaccepted fails the HugePages_Total match and is skipped) ...
setup/common.sh@32 -- # continue 00:04:20.370 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.370 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.370 20:51:35 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.370 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.370 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.370 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.370 20:51:35 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.370 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.370 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.370 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.370 20:51:35 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.370 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.370 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.370 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.370 20:51:35 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.370 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.370 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.370 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.370 20:51:35 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.370 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.370 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.370 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.370 20:51:35 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.370 20:51:35 -- setup/common.sh@33 -- # echo 1024 00:04:20.370 20:51:35 -- setup/common.sh@33 -- # return 0 00:04:20.370 20:51:35 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:20.370 20:51:35 -- setup/hugepages.sh@112 -- # get_nodes 00:04:20.370 20:51:35 -- setup/hugepages.sh@27 -- # local node 00:04:20.370 20:51:35 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:20.370 20:51:35 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:20.370 20:51:35 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:20.370 20:51:35 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:20.370 20:51:35 -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:20.370 20:51:35 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:20.370 20:51:35 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:20.370 20:51:35 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:20.370 20:51:35 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:20.370 20:51:35 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:20.370 20:51:35 -- setup/common.sh@18 -- # local node=0 00:04:20.370 20:51:35 -- setup/common.sh@19 -- # local var val 00:04:20.370 20:51:35 -- setup/common.sh@20 -- # local mem_f mem 00:04:20.370 20:51:35 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:20.370 20:51:35 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:20.370 20:51:35 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:20.370 20:51:35 -- setup/common.sh@28 -- # mapfile -t mem 00:04:20.370 20:51:35 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:20.370 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.370 20:51:35 -- setup/common.sh@31 -- # read -r 
var val _ 00:04:20.370 20:51:35 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32639140 kB' 'MemFree: 24920800 kB' 'MemUsed: 7718340 kB' 'SwapCached: 0 kB' 'Active: 4601648 kB' 'Inactive: 102364 kB' 'Active(anon): 4236152 kB' 'Inactive(anon): 0 kB' 'Active(file): 365496 kB' 'Inactive(file): 102364 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4405336 kB' 'Mapped: 134048 kB' 'AnonPages: 301740 kB' 'Shmem: 3937476 kB' 'KernelStack: 12616 kB' 'PageTables: 5632 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 124064 kB' 'Slab: 401240 kB' 'SReclaimable: 124064 kB' 'SUnreclaim: 277176 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:20.370 20:51:35 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.370 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.370 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.370 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.370 20:51:35 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.370 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.370 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.370 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.370 20:51:35 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.370 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.370 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.370 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.370 20:51:35 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.370 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.370 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.370 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.370 20:51:35 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.370 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.370 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.370 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.370 20:51:35 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.370 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.370 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.370 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.370 20:51:35 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.370 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.370 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.370 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.370 20:51:35 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.370 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.370 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.370 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.370 20:51:35 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.370 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.370 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.370 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.370 20:51:35 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.370 20:51:35 -- setup/common.sh@32 -- # 
continue 00:04:20.370 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.370 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.370 20:51:35 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.370 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.370 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.370 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.370 20:51:35 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.370 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.370 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.370 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.370 20:51:35 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.370 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.370 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.370 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.370 20:51:35 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.370 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.370 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.370 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.370 20:51:35 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.370 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.370 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.370 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.370 20:51:35 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.370 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.370 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.370 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.370 20:51:35 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.370 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.370 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.370 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.370 20:51:35 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.370 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.371 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.371 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.371 20:51:35 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.371 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.371 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.371 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.371 20:51:35 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.371 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.371 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.371 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.371 20:51:35 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.371 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.371 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.371 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.371 20:51:35 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.371 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.371 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.371 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.371 20:51:35 -- setup/common.sh@32 -- # [[ Bounce == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.371 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.371 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.371 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.371 20:51:35 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.371 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.371 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.371 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.371 20:51:35 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.371 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.371 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.371 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.371 20:51:35 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.371 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.371 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.371 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.371 20:51:35 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.371 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.371 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.371 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.371 20:51:35 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.371 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.371 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.371 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.371 20:51:35 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.371 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.371 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.371 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.371 20:51:35 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.371 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.371 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.371 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.371 20:51:35 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.371 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.371 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.371 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.371 20:51:35 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.371 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.371 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.371 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.371 20:51:35 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.371 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.371 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.371 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.371 20:51:35 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.371 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.371 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.371 20:51:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.371 20:51:35 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.371 20:51:35 -- setup/common.sh@32 -- # continue 00:04:20.371 20:51:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.371 
00:04:20.371 20:51:35 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:20.371 20:51:35 -- setup/common.sh@32 -- # continue
00:04:20.371 20:51:35 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:20.371 20:51:35 -- setup/common.sh@33 -- # echo 0
00:04:20.371 20:51:35 -- setup/common.sh@33 -- # return 0
00:04:20.371 20:51:35 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:20.371 20:51:35 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:20.371 20:51:35 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:20.371 20:51:35 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:04:20.371 20:51:35 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:20.371 20:51:35 -- setup/common.sh@18 -- # local node=1
00:04:20.371 20:51:35 -- setup/common.sh@19 -- # local var val
00:04:20.371 20:51:35 -- setup/common.sh@20 -- # local mem_f mem
00:04:20.371 20:51:35 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:20.371 20:51:35 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:04:20.371 20:51:35 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:04:20.371 20:51:35 -- setup/common.sh@28 -- # mapfile -t mem
00:04:20.371 20:51:35 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:20.371 20:51:35 -- setup/common.sh@31 -- # IFS=': '
00:04:20.371 20:51:35 -- setup/common.sh@31 -- # read -r var val _
00:04:20.371 20:51:35 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27656080 kB' 'MemFree: 16526584 kB' 'MemUsed: 11129496 kB' 'SwapCached: 0 kB' 'Active: 5360544 kB' 'Inactive: 3231200 kB' 'Active(anon): 5060948 kB' 'Inactive(anon): 17936 kB' 'Active(file): 299596 kB' 'Inactive(file): 3213264 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8361024 kB' 'Mapped: 49300 kB' 'AnonPages: 230784 kB' 'Shmem: 4848164 kB' 'KernelStack: 9416 kB' 'PageTables: 2496 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 125552 kB' 'Slab: 365648 kB' 'SReclaimable: 125552 kB' 'SUnreclaim: 240096 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[... setup/common.sh@32 compare/continue repeats for each node1 meminfo field from MemTotal through HugePages_Free ...]
00:04:20.372 20:51:35 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:20.372 20:51:35 -- setup/common.sh@33 -- # echo 0
00:04:20.372 20:51:35 -- setup/common.sh@33 -- # return 0
00:04:20.372 20:51:35 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:20.372 20:51:35 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:20.372 20:51:35 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:20.372 20:51:35 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:20.372 20:51:35 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
node0=512 expecting 512
00:04:20.372 20:51:35 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:20.372 20:51:35 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:20.372 20:51:35 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:20.372 20:51:35 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512'
node1=512 expecting 512
00:04:20.372 20:51:35 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]]
00:04:20.372
00:04:20.372 real    0m3.441s
00:04:20.372 user    0m1.293s
00:04:20.372 sys     0m2.205s
00:04:20.372 20:51:35 -- common/autotest_common.sh@1112 -- # xtrace_disable
00:04:20.372 20:51:35 -- common/autotest_common.sh@10 -- # set +x
00:04:20.372 ************************************
00:04:20.372 END TEST per_node_1G_alloc
00:04:20.372 ************************************
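For readers following the trace: the block above is the tail of per_node_1G_alloc's verification pass, where each node's HugePages_Surp is read from its sysfs meminfo and folded into the expected count. Below is a minimal paraphrased sketch of that pattern, not the literal SPDK setup/common.sh; the names get_meminfo and nodes_test come from the trace, the bodies are reconstructions.

    #!/usr/bin/env bash
    # Sketch: read one field from the global or per-node meminfo, then fold
    # per-node surplus pages into the expected hugepage counts before comparing.
    shopt -s extglob

    get_meminfo() {                       # get_meminfo <field> [numa-node]
        local get=$1 node=$2 var val _ line
        local -a mem
        local mem_f=/proc/meminfo
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")  # sysfs lines carry a "Node N " prefix
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done
        return 1
    }

    # per_node_1G_alloc expects 512 pages on each of the two nodes.
    declare -a nodes_test=(512 512)
    for node in "${!nodes_test[@]}"; do
        (( nodes_test[node] += $(get_meminfo HugePages_Surp "$node") ))
        echo "node$node=${nodes_test[node]} expecting 512"
    done

With zero surplus on both nodes, as in the trace, this prints "node0=512 expecting 512" and "node1=512 expecting 512", matching the log output above.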
00:04:20.372 20:51:35 -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc
00:04:20.372 20:51:35 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:04:20.372 20:51:35 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:04:20.372 20:51:35 -- common/autotest_common.sh@10 -- # set +x
00:04:20.633 ************************************
00:04:20.633 START TEST even_2G_alloc
00:04:20.633 ************************************
00:04:20.633 20:51:36 -- common/autotest_common.sh@1111 -- # even_2G_alloc
00:04:20.633 20:51:36 -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152
00:04:20.633 20:51:36 -- setup/hugepages.sh@49 -- # local size=2097152
00:04:20.633 20:51:36 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:04:20.633 20:51:36 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:20.633 20:51:36 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:04:20.633 20:51:36 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:04:20.633 20:51:36 -- setup/hugepages.sh@62 -- # user_nodes=()
00:04:20.633 20:51:36 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:20.633 20:51:36 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:04:20.633 20:51:36 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:20.633 20:51:36 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:20.633 20:51:36 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:20.633 20:51:36 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:04:20.633 20:51:36 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:04:20.633 20:51:36 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:20.633 20:51:36 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:04:20.633 20:51:36 -- setup/hugepages.sh@83 -- # : 512
00:04:20.633 20:51:36 -- setup/hugepages.sh@84 -- # : 1
00:04:20.633 20:51:36 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:20.633 20:51:36 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:04:20.633 20:51:36 -- setup/hugepages.sh@83 -- # : 0
00:04:20.633 20:51:36 -- setup/hugepages.sh@84 -- # : 0
00:04:20.633 20:51:36 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:20.633 20:51:36 -- setup/hugepages.sh@153 -- # NRHUGE=1024
00:04:20.633 20:51:36 -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes
00:04:20.633 20:51:36 -- setup/hugepages.sh@153 -- # setup output
00:04:20.633 20:51:36 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:20.633 20:51:36 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:23.928 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:23.928 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:23.928 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:23.928 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:23.928 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:23.928 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:23.928 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:23.928 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:23.928 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:23.928 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:23.928 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:23.928 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:23.928 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:23.928 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:23.928 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:23.928 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:23.928 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
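The get_test_nr_hugepages trace above (size 2097152 with two NUMA nodes and no user-specified node list) reduces to an even split of the hugepage count. A compact sketch of that logic, assuming bash: the variable names _nr_hugepages, _no_nodes, and nodes_test are taken from the trace, while the loop body is a paraphrase rather than SPDK's actual setup/hugepages.sh.

    # Sketch: divide the requested hugepage count evenly across NUMA nodes,
    # filling nodes_test[] from the last node down, as the trace does.
    get_test_nr_hugepages_per_node() {
        local _nr_hugepages=$1 _no_nodes=$2
        local per_node=$(( _nr_hugepages / _no_nodes ))
        nodes_test=()
        while (( _no_nodes > 0 )); do
            nodes_test[_no_nodes - 1]=$per_node
            (( _no_nodes-- ))
        done
    }

    get_test_nr_hugepages_per_node 1024 2
    echo "${nodes_test[@]}"   # -> 512 512, matching NRHUGE=1024 with HUGE_EVEN_ALLOC=yes

The setup.sh run that follows then rebinds devices and reserves the pool; the vfio-pci lines above are its per-device output.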
00:04:23.928 20:51:39 -- setup/hugepages.sh@154 -- # verify_nr_hugepages
00:04:23.928 20:51:39 -- setup/hugepages.sh@89 -- # local node
00:04:23.928 20:51:39 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:23.928 20:51:39 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:23.928 20:51:39 -- setup/hugepages.sh@92 -- # local surp
00:04:23.928 20:51:39 -- setup/hugepages.sh@93 -- # local resv
00:04:23.928 20:51:39 -- setup/hugepages.sh@94 -- # local anon
00:04:23.928 20:51:39 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:23.928 20:51:39 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:23.928 20:51:39 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:23.928 20:51:39 -- setup/common.sh@18 -- # local node=
00:04:23.928 20:51:39 -- setup/common.sh@19 -- # local var val
00:04:23.928 20:51:39 -- setup/common.sh@20 -- # local mem_f mem
00:04:23.928 20:51:39 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:23.928 20:51:39 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:23.928 20:51:39 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:23.928 20:51:39 -- setup/common.sh@28 -- # mapfile -t mem
00:04:23.928 20:51:39 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:23.928 20:51:39 -- setup/common.sh@31 -- # IFS=': '
00:04:23.928 20:51:39 -- setup/common.sh@31 -- # read -r var val _
00:04:23.928 20:51:39 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41474544 kB' 'MemAvailable: 45082932 kB' 'Buffers: 10416 kB' 'Cached: 12756028 kB' 'SwapCached: 0 kB' 'Active: 9963632 kB' 'Inactive: 3333564 kB' 'Active(anon): 9298540 kB' 'Inactive(anon): 17936 kB' 'Active(file): 665092 kB' 'Inactive(file): 3315628 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 533908 kB' 'Mapped: 183400 kB' 'Shmem: 8785724 kB' 'KReclaimable: 249616 kB' 'Slab: 766396 kB' 'SReclaimable: 249616 kB' 'SUnreclaim: 516780 kB' 'KernelStack: 22048 kB' 'PageTables: 8212 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 10608836 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213640 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 488820 kB' 'DirectMap2M: 9682944 kB' 'DirectMap1G: 58720256 kB'
[... setup/common.sh@32 compare/continue repeats for each meminfo field from MemTotal until AnonHugePages is found ...]
00:04:23.929 20:51:39 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:23.929 20:51:39 -- setup/common.sh@33 -- # echo 0
00:04:23.929 20:51:39 -- setup/common.sh@33 -- # return 0
00:04:23.929 20:51:39 -- setup/hugepages.sh@97 -- # anon=0
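The [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] test above checks whether transparent hugepages are enabled before counting AnonHugePages. A small sketch of that guard, assuming bash; the sysfs path is the standard kernel location, and the surrounding logic is paraphrased from the trace rather than quoted from the script.

    # Only account for anonymous (transparent) hugepages when THP is not
    # disabled; /sys/kernel/mm/transparent_hugepage/enabled prints e.g.
    # "always [madvise] never" with the active mode in brackets.
    anon=0
    thp=$(< /sys/kernel/mm/transparent_hugepage/enabled)
    if [[ $thp != *'[never]'* ]]; then
        anon=$(get_meminfo AnonHugePages)   # get_meminfo as sketched earlier
    fi
    echo "anon=$anon"

Here AnonHugePages is 0 kB in the dump, so the trace arrives at anon=0 exactly as above.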
00:04:23.929 20:51:39 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:23.929 20:51:39 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:23.929 20:51:39 -- setup/common.sh@18 -- # local node=
00:04:23.929 20:51:39 -- setup/common.sh@19 -- # local var val
00:04:23.929 20:51:39 -- setup/common.sh@20 -- # local mem_f mem
00:04:23.929 20:51:39 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:23.929 20:51:39 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:23.929 20:51:39 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:23.929 20:51:39 -- setup/common.sh@28 -- # mapfile -t mem
00:04:23.929 20:51:39 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:23.930 20:51:39 -- setup/common.sh@31 -- # IFS=': '
00:04:23.930 20:51:39 -- setup/common.sh@31 -- # read -r var val _
00:04:23.930 20:51:39 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41471404 kB' 'MemAvailable: 45079792 kB' 'Buffers: 10416 kB' 'Cached: 12756028 kB' 'SwapCached: 0 kB' 'Active: 9966612 kB' 'Inactive: 3333564 kB' 'Active(anon): 9301520 kB' 'Inactive(anon): 17936 kB' 'Active(file): 665092 kB' 'Inactive(file): 3315628 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 536844 kB' 'Mapped: 183884 kB' 'Shmem: 8785724 kB' 'KReclaimable: 249616 kB' 'Slab: 766396 kB' 'SReclaimable: 249616 kB' 'SUnreclaim: 516780 kB' 'KernelStack: 21984 kB' 'PageTables: 8004 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 10612016 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213608 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 488820 kB' 'DirectMap2M: 9682944 kB' 'DirectMap1G: 58720256 kB'
[... setup/common.sh@32 compare/continue repeats for each meminfo field from MemTotal until HugePages_Surp is found ...]
00:04:23.931 20:51:39 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:23.931 20:51:39 -- setup/common.sh@33 -- # echo 0
00:04:23.931 20:51:39 -- setup/common.sh@33 -- # return 0
00:04:23.931 20:51:39 -- setup/hugepages.sh@99 -- # surp=0
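At this point the trace has established anon=0 and surp=0 and is about to read HugePages_Rsvd. The overall shape of the verification pass, assuming bash and paraphrased from the flow visible in the xtrace (the helper bodies and the exact final comparison are reconstructions; NRHUGE=1024 was set by even_2G_alloc above):

    # Skeleton of verify_nr_hugepages as the trace walks it: gather the global
    # counters once, then require the configured pool with no surplus pages.
    verify_nr_hugepages() {
        local anon surp resv nr free
        anon=$(get_meminfo AnonHugePages)   # THP in use would skew the numbers
        surp=$(get_meminfo HugePages_Surp)  # surplus pages beyond the static pool
        resv=$(get_meminfo HugePages_Rsvd)  # reserved but not yet faulted pages
        nr=$(get_meminfo HugePages_Total)
        free=$(get_meminfo HugePages_Free)
        echo "nr=$nr free=$free surp=$surp resv=$resv anon=$anon"
        (( nr == NRHUGE && surp == 0 ))     # even_2G_alloc expects the full 1024
    }

    NRHUGE=1024 verify_nr_hugepages

With the dumps above reporting HugePages_Total: 1024, HugePages_Free: 1024, and zero surplus and reserved pages, this check passes.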
00:04:23.931 20:51:39 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:23.931 20:51:39 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:23.931 20:51:39 -- setup/common.sh@18 -- # local node=
00:04:23.931 20:51:39 -- setup/common.sh@19 -- # local var val
00:04:23.931 20:51:39 -- setup/common.sh@20 -- # local mem_f mem
00:04:23.931 20:51:39 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:23.931 20:51:39 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:23.931 20:51:39 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:23.931 20:51:39 -- setup/common.sh@28 -- # mapfile -t mem
00:04:23.931 20:51:39 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:23.931 20:51:39 -- setup/common.sh@31 -- # IFS=': '
00:04:23.931 20:51:39 -- setup/common.sh@31 -- # read -r var val _
00:04:23.931 20:51:39 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41474908 kB' 'MemAvailable: 45083296 kB' 'Buffers: 10416 kB' 'Cached: 12756040 kB' 'SwapCached: 0 kB' 'Active: 9963244 kB' 'Inactive: 3333564 kB' 'Active(anon): 9298152 kB' 'Inactive(anon): 17936 kB' 'Active(file): 665092 kB' 'Inactive(file): 3315628 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 533484 kB' 'Mapped: 183716 kB' 'Shmem: 8785736 kB' 'KReclaimable: 249616 kB' 'Slab: 766372 kB' 'SReclaimable: 249616 kB' 'SUnreclaim: 516756 kB' 'KernelStack: 22016 kB' 'PageTables: 8096 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 10607900 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213624 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 488820 kB' 'DirectMap2M: 9682944 kB' 'DirectMap1G: 58720256 kB'
[... setup/common.sh@32 compare/continue scan proceeds field by field against HugePages_Rsvd, from MemTotal through CmaTotal ...]
00:04:24.194 20:51:39 --
setup/common.sh@31 -- # IFS=': ' 00:04:24.194 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.194 20:51:39 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.194 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.194 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.194 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.194 20:51:39 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.194 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.194 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.194 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.194 20:51:39 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.194 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.194 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.194 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.194 20:51:39 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.194 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.194 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.194 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.194 20:51:39 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:24.194 20:51:39 -- setup/common.sh@33 -- # echo 0 00:04:24.194 20:51:39 -- setup/common.sh@33 -- # return 0 00:04:24.194 20:51:39 -- setup/hugepages.sh@100 -- # resv=0 00:04:24.194 20:51:39 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:24.194 nr_hugepages=1024 00:04:24.194 20:51:39 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:24.194 resv_hugepages=0 00:04:24.194 20:51:39 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:24.194 surplus_hugepages=0 00:04:24.194 20:51:39 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:24.194 anon_hugepages=0 00:04:24.194 20:51:39 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:24.194 20:51:39 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:24.194 20:51:39 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:24.194 20:51:39 -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:24.194 20:51:39 -- setup/common.sh@18 -- # local node= 00:04:24.194 20:51:39 -- setup/common.sh@19 -- # local var val 00:04:24.194 20:51:39 -- setup/common.sh@20 -- # local mem_f mem 00:04:24.194 20:51:39 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:24.194 20:51:39 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:24.194 20:51:39 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:24.194 20:51:39 -- setup/common.sh@28 -- # mapfile -t mem 00:04:24.194 20:51:39 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:24.194 20:51:39 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41475640 kB' 'MemAvailable: 45084028 kB' 'Buffers: 10416 kB' 'Cached: 12756068 kB' 'SwapCached: 0 kB' 'Active: 9962932 kB' 'Inactive: 3333564 kB' 'Active(anon): 9297840 kB' 'Inactive(anon): 17936 kB' 'Active(file): 665092 kB' 'Inactive(file): 3315628 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 533172 kB' 'Mapped: 183380 kB' 'Shmem: 8785764 kB' 'KReclaimable: 249616 kB' 'Slab: 766340 kB' 'SReclaimable: 249616 kB' 'SUnreclaim: 516724 kB' 'KernelStack: 22032 kB' 'PageTables: 8152 kB' 'SecPageTables: 0 kB' 
'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 10607916 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213624 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 488820 kB' 'DirectMap2M: 9682944 kB' 'DirectMap1G: 58720256 kB' 00:04:24.194 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.194 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.194 20:51:39 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.194 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.194 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.194 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.194 20:51:39 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.194 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.194 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.194 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.194 20:51:39 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.194 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.194 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.194 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.194 20:51:39 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.194 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.194 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.194 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.194 20:51:39 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.194 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.194 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.194 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.194 20:51:39 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.194 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.194 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.194 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.194 20:51:39 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.194 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.194 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.194 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.194 20:51:39 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.194 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.194 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.194 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.194 20:51:39 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.194 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.194 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.194 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.194 20:51:39 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.194 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.194 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.194 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 
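The long [[ ... ]] / continue runs in this trace are setup/common.sh's get_meminfo scanning each captured "Key: value" pair until the requested key matches, then echoing its value. A minimal standalone sketch of that pattern (meminfo_value is a hypothetical stand-in, not the repository's exact code):

#!/usr/bin/env bash
# Sketch of the scan traced above: split each meminfo line on ': ' and
# print the value for one requested key. The [[ ... ]] / continue pairs
# in the log are exactly this loop under xtrace.
meminfo_value() {
    local get=$1 file=${2:-/proc/meminfo}
    local var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done < "$file"
    return 1
}
meminfo_value HugePages_Total   # this run's trace returns 1024 here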
00:04:24.194 20:51:39 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.194 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.194 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.194 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.194 20:51:39 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.194 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.194 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.194 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.194 20:51:39 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.194 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.194 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.194 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.194 20:51:39 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.194 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.194 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.194 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.194 20:51:39 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.194 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.194 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.194 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.194 20:51:39 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.194 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.194 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.194 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.194 20:51:39 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.194 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.194 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.195 20:51:39 
-- setup/common.sh@31 -- # IFS=': ' 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # 
[[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.195 20:51:39 -- 
setup/common.sh@31 -- # IFS=': ' 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.195 20:51:39 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:24.195 20:51:39 -- setup/common.sh@33 -- # echo 1024 00:04:24.195 20:51:39 -- setup/common.sh@33 -- # return 0 00:04:24.195 20:51:39 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:24.195 20:51:39 -- setup/hugepages.sh@112 -- # get_nodes 00:04:24.195 20:51:39 -- setup/hugepages.sh@27 -- # local node 00:04:24.195 20:51:39 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:24.195 20:51:39 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:24.195 20:51:39 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:24.195 20:51:39 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:24.195 20:51:39 -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:24.195 20:51:39 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:24.195 20:51:39 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:24.195 20:51:39 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:24.195 20:51:39 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:24.195 20:51:39 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:24.195 20:51:39 -- setup/common.sh@18 -- # local node=0 00:04:24.195 20:51:39 -- setup/common.sh@19 -- # local var val 00:04:24.195 20:51:39 -- setup/common.sh@20 -- # local mem_f mem 00:04:24.195 20:51:39 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:24.195 20:51:39 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:24.195 20:51:39 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:24.195 20:51:39 -- setup/common.sh@28 -- # mapfile -t mem 00:04:24.195 20:51:39 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:24.195 20:51:39 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32639140 kB' 'MemFree: 24932984 kB' 'MemUsed: 7706156 kB' 'SwapCached: 0 kB' 'Active: 4601668 kB' 'Inactive: 102364 kB' 'Active(anon): 4236172 kB' 'Inactive(anon): 0 kB' 'Active(file): 365496 kB' 'Inactive(file): 102364 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4405368 kB' 'Mapped: 134076 kB' 'AnonPages: 301760 kB' 'Shmem: 3937508 kB' 'KernelStack: 12616 kB' 'PageTables: 5660 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 124064 kB' 'Slab: 401188 kB' 'SReclaimable: 124064 kB' 'SUnreclaim: 277124 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.195 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.196 20:51:39 -- 
setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 
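This pass queries node 0 specifically: with node= set, get_meminfo switched its source from /proc/meminfo to the per-node file, as the mem_f assignments above show. A sketch of that selection, assuming bash extglob for the prefix strip:

#!/usr/bin/env bash
shopt -s extglob
# Sketch of the per-node source selection traced above. Per-node meminfo
# lines carry a "Node N " prefix, which the expansion below strips so the
# same key scan works against either source.
node=0
mem_f=/proc/meminfo
[[ -e /sys/devices/system/node/node$node/meminfo ]] &&
    mem_f=/sys/devices/system/node/node$node/meminfo
mapfile -t mem < "$mem_f"
mem=("${mem[@]#Node +([0-9]) }")   # "Node 0 HugePages_Total: 512" -> "HugePages_Total: 512"
printf '%s\n' "${mem[@]}" | grep -E '^HugePages_'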
00:04:24.196 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.196 20:51:39 -- 
setup/common.sh@32 -- # continue 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.196 20:51:39 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.196 20:51:39 -- setup/common.sh@33 -- # echo 0 00:04:24.196 20:51:39 -- setup/common.sh@33 -- # return 0 00:04:24.196 20:51:39 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:24.196 20:51:39 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:24.196 20:51:39 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:24.196 20:51:39 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:24.196 20:51:39 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:24.196 20:51:39 -- setup/common.sh@18 -- # local node=1 00:04:24.196 20:51:39 -- setup/common.sh@19 -- # local var val 00:04:24.196 20:51:39 -- setup/common.sh@20 -- # local mem_f mem 00:04:24.196 20:51:39 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:24.196 20:51:39 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:24.196 20:51:39 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:24.196 20:51:39 -- setup/common.sh@28 -- # 
mapfile -t mem 00:04:24.196 20:51:39 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.196 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.197 20:51:39 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27656080 kB' 'MemFree: 16541908 kB' 'MemUsed: 11114172 kB' 'SwapCached: 0 kB' 'Active: 5361432 kB' 'Inactive: 3231200 kB' 'Active(anon): 5061836 kB' 'Inactive(anon): 17936 kB' 'Active(file): 299596 kB' 'Inactive(file): 3213264 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8361132 kB' 'Mapped: 49304 kB' 'AnonPages: 231616 kB' 'Shmem: 4848272 kB' 'KernelStack: 9416 kB' 'PageTables: 2512 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 125552 kB' 'Slab: 365152 kB' 'SReclaimable: 125552 kB' 'SUnreclaim: 239600 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 
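Node 1 is read the same way; the figures in the printf above (HugePages_Total: 512, HugePages_Surp: 0) feed the consistency check hugepages.sh runs once both nodes are in. Distilled, that check is roughly the following standalone sketch, using the values visible in this run:

#!/usr/bin/env bash
# Distilled sketch of the accounting check traced in this section: the
# kernel-reported total must equal the requested pages plus any surplus
# and reserved pages. Values below come from this run's trace.
nr_hugepages=1024   # requested count
total=1024          # HugePages_Total
surp=0              # HugePages_Surp
resv=0              # HugePages_Rsvd
if (( total == nr_hugepages + surp + resv )); then
    echo 'hugepage accounting consistent'
else
    echo "mismatch: total=$total vs $((nr_hugepages + surp + resv))" >&2
    exit 1
fi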
00:04:24.197 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.197 20:51:39 -- 
setup/common.sh@32 -- # continue 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.197 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.197 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.198 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.198 20:51:39 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.198 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.198 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.198 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.198 20:51:39 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.198 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.198 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.198 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.198 20:51:39 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.198 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.198 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.198 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 
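Once the node-1 scan below finishes, the test compares each node against an even split ("node0=512 expecting 512", "node1=512 expecting 512"), and the odd_alloc test that follows sets up the lopsided 513/512 variant. A sketch of that split; which node takes the extra page is an assumption here, following the 513-on-node-0 assignment visible later in the trace:

#!/usr/bin/env bash
# Sketch of the per-node split checked below: divide nr_hugepages across
# NUMA nodes, with an odd total leaving one node a single extra page
# (the 513/512 layout the odd_alloc test requests).
nr_hugepages=1025
no_nodes=2
declare -a nodes_test
for (( i = 0; i < no_nodes; i++ )); do
    nodes_test[i]=$(( nr_hugepages / no_nodes ))
done
(( nodes_test[0] += nr_hugepages % no_nodes ))   # assumption: remainder goes to node 0
for i in "${!nodes_test[@]}"; do
    echo "node$i=${nodes_test[i]} expecting ${nodes_test[i]}"
done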
00:04:24.198 20:51:39 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.198 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.198 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.198 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.198 20:51:39 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.198 20:51:39 -- setup/common.sh@32 -- # continue 00:04:24.198 20:51:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:24.198 20:51:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:24.198 20:51:39 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:24.198 20:51:39 -- setup/common.sh@33 -- # echo 0 00:04:24.198 20:51:39 -- setup/common.sh@33 -- # return 0 00:04:24.198 20:51:39 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:24.198 20:51:39 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:24.198 20:51:39 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:24.198 20:51:39 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:24.198 20:51:39 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:24.198 node0=512 expecting 512 00:04:24.198 20:51:39 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:24.198 20:51:39 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:24.198 20:51:39 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:24.198 20:51:39 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:04:24.198 node1=512 expecting 512 00:04:24.198 20:51:39 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:04:24.198 00:04:24.198 real 0m3.651s 00:04:24.198 user 0m1.408s 00:04:24.198 sys 0m2.304s 00:04:24.198 20:51:39 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:04:24.198 20:51:39 -- common/autotest_common.sh@10 -- # set +x 00:04:24.198 ************************************ 00:04:24.198 END TEST even_2G_alloc 00:04:24.198 ************************************ 00:04:24.198 20:51:39 -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc 00:04:24.198 20:51:39 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:24.198 20:51:39 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:24.198 20:51:39 -- common/autotest_common.sh@10 -- # set +x 00:04:24.457 ************************************ 00:04:24.457 START TEST odd_alloc 00:04:24.457 ************************************ 00:04:24.457 20:51:39 -- common/autotest_common.sh@1111 -- # odd_alloc 00:04:24.457 20:51:39 -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176 00:04:24.457 20:51:39 -- setup/hugepages.sh@49 -- # local size=2098176 00:04:24.457 20:51:39 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:24.457 20:51:39 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:24.457 20:51:39 -- setup/hugepages.sh@57 -- # nr_hugepages=1025 00:04:24.457 20:51:39 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:24.457 20:51:39 -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:24.457 20:51:39 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:24.457 20:51:39 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025 00:04:24.457 20:51:39 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:24.457 20:51:39 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:24.457 20:51:39 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:24.457 20:51:39 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:24.457 20:51:39 -- setup/hugepages.sh@74 -- # (( 0 > 
0 )) 00:04:24.457 20:51:39 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:24.457 20:51:39 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:24.457 20:51:39 -- setup/hugepages.sh@83 -- # : 513 00:04:24.457 20:51:39 -- setup/hugepages.sh@84 -- # : 1 00:04:24.457 20:51:39 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:24.457 20:51:39 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513 00:04:24.457 20:51:39 -- setup/hugepages.sh@83 -- # : 0 00:04:24.457 20:51:39 -- setup/hugepages.sh@84 -- # : 0 00:04:24.457 20:51:39 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:24.457 20:51:39 -- setup/hugepages.sh@160 -- # HUGEMEM=2049 00:04:24.457 20:51:39 -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes 00:04:24.457 20:51:39 -- setup/hugepages.sh@160 -- # setup output 00:04:24.457 20:51:39 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:24.457 20:51:39 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:27.758 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:27.758 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:27.758 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:27.758 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:27.758 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:27.758 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:27.758 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:27.758 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:27.758 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:27.758 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:27.758 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:27.758 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:27.758 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:27.758 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:27.758 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:27.758 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:27.758 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:27.758 20:51:43 -- setup/hugepages.sh@161 -- # verify_nr_hugepages 00:04:27.758 20:51:43 -- setup/hugepages.sh@89 -- # local node 00:04:27.758 20:51:43 -- setup/hugepages.sh@90 -- # local sorted_t 00:04:27.758 20:51:43 -- setup/hugepages.sh@91 -- # local sorted_s 00:04:27.758 20:51:43 -- setup/hugepages.sh@92 -- # local surp 00:04:27.758 20:51:43 -- setup/hugepages.sh@93 -- # local resv 00:04:27.758 20:51:43 -- setup/hugepages.sh@94 -- # local anon 00:04:27.758 20:51:43 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:27.758 20:51:43 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:27.758 20:51:43 -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:27.758 20:51:43 -- setup/common.sh@18 -- # local node= 00:04:27.758 20:51:43 -- setup/common.sh@19 -- # local var val 00:04:27.758 20:51:43 -- setup/common.sh@20 -- # local mem_f mem 00:04:27.758 20:51:43 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:27.758 20:51:43 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:27.758 20:51:43 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:27.758 20:51:43 -- setup/common.sh@28 -- # mapfile -t mem 00:04:27.758 20:51:43 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:27.758 20:51:43 -- 
setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41489420 kB' 'MemAvailable: 45097808 kB' 'Buffers: 10416 kB' 'Cached: 12756160 kB' 'SwapCached: 0 kB' 'Active: 9964756 kB' 'Inactive: 3333564 kB' 'Active(anon): 9299664 kB' 'Inactive(anon): 17936 kB' 'Active(file): 665092 kB' 'Inactive(file): 3315628 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 534228 kB' 'Mapped: 183508 kB' 'Shmem: 8785856 kB' 'KReclaimable: 249616 kB' 'Slab: 766596 kB' 'SReclaimable: 249616 kB' 'SUnreclaim: 516980 kB' 'KernelStack: 22176 kB' 'PageTables: 8800 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486612 kB' 'Committed_AS: 10608144 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213800 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 488820 kB' 'DirectMap2M: 9682944 kB' 'DirectMap1G: 58720256 kB' 00:04:27.758 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.758 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.758 20:51:43 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.758 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.758 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.758 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.758 20:51:43 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.758 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.758 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.758 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.758 20:51:43 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.758 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.758 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.758 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.758 20:51:43 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.758 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.758 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.758 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.758 20:51:43 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.758 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.758 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.758 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.758 20:51:43 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- 
setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.759 
20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 
-- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.759 20:51:43 -- setup/common.sh@33 -- # echo 0 00:04:27.759 20:51:43 -- setup/common.sh@33 -- # return 0 00:04:27.759 20:51:43 -- setup/hugepages.sh@97 -- # anon=0 00:04:27.759 20:51:43 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:27.759 20:51:43 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:27.759 20:51:43 -- setup/common.sh@18 -- # local node= 00:04:27.759 20:51:43 -- setup/common.sh@19 -- # local var val 00:04:27.759 20:51:43 -- setup/common.sh@20 -- # local mem_f mem 00:04:27.759 20:51:43 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:27.759 20:51:43 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:27.759 20:51:43 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:27.759 20:51:43 -- setup/common.sh@28 -- # mapfile -t mem 00:04:27.759 20:51:43 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41490132 kB' 'MemAvailable: 45098520 kB' 'Buffers: 10416 kB' 'Cached: 12756160 kB' 'SwapCached: 0 kB' 'Active: 9964072 kB' 'Inactive: 3333564 kB' 'Active(anon): 9298980 kB' 'Inactive(anon): 17936 kB' 'Active(file): 665092 kB' 'Inactive(file): 3315628 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 533920 kB' 
'Mapped: 183504 kB' 'Shmem: 8785856 kB' 'KReclaimable: 249616 kB' 'Slab: 766596 kB' 'SReclaimable: 249616 kB' 'SUnreclaim: 516980 kB' 'KernelStack: 22192 kB' 'PageTables: 8524 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486612 kB' 'Committed_AS: 10608156 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213752 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 488820 kB' 'DirectMap2M: 9682944 kB' 'DirectMap1G: 58720256 kB' 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 
00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.759 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.759 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.760 20:51:43 -- 
setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 
20:51:43 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 
20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.760 20:51:43 -- setup/common.sh@33 -- # echo 0 00:04:27.760 20:51:43 -- setup/common.sh@33 -- # return 0 00:04:27.760 20:51:43 -- setup/hugepages.sh@99 -- # surp=0 00:04:27.760 20:51:43 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:27.760 20:51:43 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:27.760 20:51:43 -- setup/common.sh@18 -- # local node= 00:04:27.760 20:51:43 -- setup/common.sh@19 -- # local var val 00:04:27.760 20:51:43 -- setup/common.sh@20 -- # local mem_f mem 00:04:27.760 20:51:43 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:27.760 20:51:43 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:27.760 20:51:43 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:27.760 20:51:43 -- setup/common.sh@28 -- # mapfile -t mem 00:04:27.760 20:51:43 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41493584 kB' 'MemAvailable: 45101972 kB' 'Buffers: 10416 kB' 'Cached: 12756184 kB' 'SwapCached: 0 kB' 'Active: 9964312 kB' 'Inactive: 3333564 kB' 'Active(anon): 9299220 kB' 'Inactive(anon): 17936 kB' 'Active(file): 665092 kB' 'Inactive(file): 3315628 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 534692 kB' 'Mapped: 183436 kB' 'Shmem: 8785880 kB' 'KReclaimable: 249616 kB' 'Slab: 766844 kB' 'SReclaimable: 249616 kB' 'SUnreclaim: 517228 kB' 'KernelStack: 22000 kB' 'PageTables: 8144 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486612 kB' 'Committed_AS: 10606668 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213736 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 488820 kB' 'DirectMap2M: 9682944 kB' 'DirectMap1G: 58720256 kB' 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.760 20:51:43 -- 
setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- 
setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- 
setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.760 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.760 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.761 20:51:43 -- 
setup/common.sh@32 -- # continue 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:27.761 20:51:43 -- setup/common.sh@33 -- # echo 0 00:04:27.761 20:51:43 -- setup/common.sh@33 -- # return 0 00:04:27.761 20:51:43 -- setup/hugepages.sh@100 -- # resv=0 
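The trace above is bash xtrace output from get_meminfo in setup/common.sh: the function loads /proc/meminfo (or a per-node copy) into an array, then scans it field by field, skipping every key that does not match the one requested; the backslash-escaped right-hand side in each [[ ... == \H\u\g\e... ]] record is simply how xtrace prints a quoted, literal (non-glob) comparison. A minimal sketch of the parser, reconstructed from the @16-@33 trace records and not the verbatim script, looks like this:

    #!/usr/bin/env bash
    shopt -s extglob   # needed for the +([0-9]) pattern below

    get_meminfo() {
        # Usage: get_meminfo <field> [numa-node], e.g. get_meminfo HugePages_Surp 0
        local get=$1 node=${2:-} var val _
        local mem_f=/proc/meminfo mem line
        # With a node argument, read that node's meminfo instead of the global file.
        if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        # Per-node meminfo prefixes each line with "Node <n> "; strip that prefix.
        mem=("${mem[@]#Node +([0-9]) }")
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] || continue
            echo "$val"    # kB for sizes, a bare count for HugePages_* fields
            return 0
        done
        return 1
    }

Each lookup in this stretch returned 0, so hugepages.sh records anon=0 (at @97), surp=0 (at @99) and now resv=0 before checking the totals below.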
00:04:27.761 20:51:43 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025 00:04:27.761 nr_hugepages=1025 00:04:27.761 20:51:43 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:27.761 resv_hugepages=0 00:04:27.761 20:51:43 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:27.761 surplus_hugepages=0 00:04:27.761 20:51:43 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:27.761 anon_hugepages=0 00:04:27.761 20:51:43 -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv )) 00:04:27.761 20:51:43 -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages )) 00:04:27.761 20:51:43 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:27.761 20:51:43 -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:27.761 20:51:43 -- setup/common.sh@18 -- # local node= 00:04:27.761 20:51:43 -- setup/common.sh@19 -- # local var val 00:04:27.761 20:51:43 -- setup/common.sh@20 -- # local mem_f mem 00:04:27.761 20:51:43 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:27.761 20:51:43 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:27.761 20:51:43 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:27.761 20:51:43 -- setup/common.sh@28 -- # mapfile -t mem 00:04:27.761 20:51:43 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.761 20:51:43 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41499852 kB' 'MemAvailable: 45108240 kB' 'Buffers: 10416 kB' 'Cached: 12756192 kB' 'SwapCached: 0 kB' 'Active: 9964052 kB' 'Inactive: 3333564 kB' 'Active(anon): 9298960 kB' 'Inactive(anon): 17936 kB' 'Active(file): 665092 kB' 'Inactive(file): 3315628 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 534532 kB' 'Mapped: 183436 kB' 'Shmem: 8785888 kB' 'KReclaimable: 249616 kB' 'Slab: 766844 kB' 'SReclaimable: 249616 kB' 'SUnreclaim: 517228 kB' 'KernelStack: 22128 kB' 'PageTables: 8280 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486612 kB' 'Committed_AS: 10607948 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213608 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 488820 kB' 'DirectMap2M: 9682944 kB' 'DirectMap1G: 58720256 kB' 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # 
IFS=': ' 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # [[ SwapFree == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.761 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.761 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- 
setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 
00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:27.762 20:51:43 -- setup/common.sh@33 -- # echo 1025 00:04:27.762 20:51:43 -- setup/common.sh@33 -- # return 0 00:04:27.762 20:51:43 -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv )) 00:04:27.762 20:51:43 -- setup/hugepages.sh@112 -- # get_nodes 00:04:27.762 20:51:43 -- setup/hugepages.sh@27 -- # local node 00:04:27.762 20:51:43 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:27.762 20:51:43 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:27.762 20:51:43 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:27.762 20:51:43 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513 00:04:27.762 20:51:43 -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:27.762 20:51:43 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:27.762 20:51:43 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:27.762 20:51:43 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:27.762 20:51:43 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:27.762 20:51:43 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:27.762 20:51:43 -- setup/common.sh@18 -- # local node=0 00:04:27.762 20:51:43 -- setup/common.sh@19 -- # 
local var val 00:04:27.762 20:51:43 -- setup/common.sh@20 -- # local mem_f mem 00:04:27.762 20:51:43 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:27.762 20:51:43 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:27.762 20:51:43 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:27.762 20:51:43 -- setup/common.sh@28 -- # mapfile -t mem 00:04:27.762 20:51:43 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32639140 kB' 'MemFree: 24939248 kB' 'MemUsed: 7699892 kB' 'SwapCached: 0 kB' 'Active: 4606888 kB' 'Inactive: 102364 kB' 'Active(anon): 4241392 kB' 'Inactive(anon): 0 kB' 'Active(file): 365496 kB' 'Inactive(file): 102364 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4405444 kB' 'Mapped: 134104 kB' 'AnonPages: 307220 kB' 'Shmem: 3937584 kB' 'KernelStack: 12824 kB' 'PageTables: 6368 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 124064 kB' 'Slab: 401984 kB' 'SReclaimable: 124064 kB' 'SUnreclaim: 277920 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.762 
20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- 
setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- 
setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.762 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.762 20:51:43 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.762 20:51:43 -- setup/common.sh@33 -- # echo 0 00:04:27.762 20:51:43 -- setup/common.sh@33 -- # return 0 00:04:27.762 20:51:43 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:27.763 20:51:43 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:27.763 20:51:43 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:27.763 20:51:43 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:27.763 20:51:43 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:27.763 20:51:43 -- setup/common.sh@18 -- # local node=1 00:04:27.763 20:51:43 -- setup/common.sh@19 -- # local var val 00:04:27.763 20:51:43 -- setup/common.sh@20 -- # local mem_f mem 00:04:27.763 20:51:43 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:27.763 20:51:43 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:27.763 20:51:43 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:27.763 20:51:43 -- setup/common.sh@28 -- # mapfile -t mem 00:04:27.763 20:51:43 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.763 20:51:43 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27656080 kB' 'MemFree: 16563416 kB' 'MemUsed: 11092664 kB' 'SwapCached: 0 kB' 'Active: 5357872 kB' 'Inactive: 3231200 kB' 'Active(anon): 5058276 kB' 'Inactive(anon): 17936 kB' 'Active(file): 299596 kB' 'Inactive(file): 3213264 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8361164 kB' 'Mapped: 49332 kB' 'AnonPages: 227976 kB' 'Shmem: 4848304 kB' 'KernelStack: 9384 kB' 'PageTables: 2380 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 125552 kB' 'Slab: 364836 kB' 'SReclaimable: 125552 kB' 'SUnreclaim: 239284 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0' 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.763 
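Every `continue` entry above is one non-matching key, so a single lookup walks the whole record. Not the test's method, but an equivalent one-off spot check for the value being extracted here (the node-1 path exists per the trace, and the HugePages lines carry no "kB" suffix, so the last field is the count):

# Ad-hoc equivalent of the HugePages_Surp lookup now in progress:
awk '/HugePages_Surp:/ {print $NF}' /sys/devices/system/node/node1/meminfo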
20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.763 
20:51:43 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # 
IFS=': ' 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # continue 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.763 20:51:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.763 20:51:43 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.763 20:51:43 -- setup/common.sh@33 -- # echo 0 00:04:27.763 20:51:43 -- setup/common.sh@33 -- # return 0 00:04:27.763 20:51:43 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:27.763 20:51:43 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:27.763 20:51:43 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:27.763 20:51:43 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:27.763 20:51:43 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513' 00:04:27.763 node0=512 expecting 513 00:04:27.763 20:51:43 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:27.763 20:51:43 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:27.763 20:51:43 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 
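hugepages.sh@127 sorts by using array indices as the sort key: writing sorted_t[count]=1 makes each per-node count an index of an indexed array, and `${!sorted_t[@]}` later expands those indices in ascending order. That is why the @130 comparison (next entry) passes even though node0/node1 are deliberately swapped here (node0=512 expecting 513, node1=513 expecting 512, against the 512/513 recorded into nodes_sys at @30): both distributions hold the same multiset of counts. A sketch of the idiom:

# Index-as-sort idiom behind hugepages.sh@127/@130:
declare -a sorted_t sorted_s
sorted_t[513]=1; sorted_t[512]=1        # test counts, insertion order irrelevant
sorted_s[512]=1; sorted_s[513]=1        # sysfs counts
[[ ${!sorted_t[*]} == "${!sorted_s[*]}" ]] && echo 'counts match'   # both expand "512 513"
# The real check compares against a literal, hence xtrace's "\5\1\2\ \5\1\3" rendering.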
00:04:27.763 20:51:43 -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512' 00:04:27.763 node1=513 expecting 512 00:04:27.763 20:51:43 -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:04:27.763 00:04:27.763 real 0m3.389s 00:04:27.763 user 0m1.353s 00:04:27.763 sys 0m2.093s 00:04:27.763 20:51:43 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:04:27.763 20:51:43 -- common/autotest_common.sh@10 -- # set +x 00:04:27.763 ************************************ 00:04:27.763 END TEST odd_alloc 00:04:27.763 ************************************ 00:04:27.763 20:51:43 -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc 00:04:27.763 20:51:43 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:27.763 20:51:43 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:27.763 20:51:43 -- common/autotest_common.sh@10 -- # set +x 00:04:28.022 ************************************ 00:04:28.022 START TEST custom_alloc 00:04:28.022 ************************************ 00:04:28.022 20:51:43 -- common/autotest_common.sh@1111 -- # custom_alloc 00:04:28.022 20:51:43 -- setup/hugepages.sh@167 -- # local IFS=, 00:04:28.022 20:51:43 -- setup/hugepages.sh@169 -- # local node 00:04:28.022 20:51:43 -- setup/hugepages.sh@170 -- # nodes_hp=() 00:04:28.022 20:51:43 -- setup/hugepages.sh@170 -- # local nodes_hp 00:04:28.022 20:51:43 -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0 00:04:28.022 20:51:43 -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576 00:04:28.022 20:51:43 -- setup/hugepages.sh@49 -- # local size=1048576 00:04:28.022 20:51:43 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:28.022 20:51:43 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:28.022 20:51:43 -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:04:28.022 20:51:43 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:28.022 20:51:43 -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:28.022 20:51:43 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:28.022 20:51:43 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:04:28.022 20:51:43 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:28.022 20:51:43 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:28.022 20:51:43 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:28.022 20:51:43 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:28.022 20:51:43 -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:28.022 20:51:43 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:28.022 20:51:43 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:04:28.022 20:51:43 -- setup/hugepages.sh@83 -- # : 256 00:04:28.022 20:51:43 -- setup/hugepages.sh@84 -- # : 1 00:04:28.022 20:51:43 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:28.022 20:51:43 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:04:28.022 20:51:43 -- setup/hugepages.sh@83 -- # : 0 00:04:28.022 20:51:43 -- setup/hugepages.sh@84 -- # : 0 00:04:28.022 20:51:43 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:28.022 20:51:43 -- setup/hugepages.sh@175 -- # nodes_hp[0]=512 00:04:28.022 20:51:43 -- setup/hugepages.sh@176 -- # (( 2 > 1 )) 00:04:28.022 20:51:43 -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152 00:04:28.022 20:51:43 -- setup/hugepages.sh@49 -- # local size=2097152 00:04:28.022 20:51:43 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:28.022 20:51:43 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:28.022 20:51:43 -- setup/hugepages.sh@57 -- # 
nr_hugepages=1024 00:04:28.022 20:51:43 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:28.022 20:51:43 -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:28.022 20:51:43 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:28.022 20:51:43 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:28.022 20:51:43 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:28.022 20:51:43 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:28.022 20:51:43 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:28.022 20:51:43 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:28.022 20:51:43 -- setup/hugepages.sh@74 -- # (( 1 > 0 )) 00:04:28.022 20:51:43 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:28.022 20:51:43 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:04:28.022 20:51:43 -- setup/hugepages.sh@78 -- # return 0 00:04:28.022 20:51:43 -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024 00:04:28.022 20:51:43 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:04:28.022 20:51:43 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:28.022 20:51:43 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:28.022 20:51:43 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:04:28.022 20:51:43 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:28.022 20:51:43 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:28.022 20:51:43 -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node 00:04:28.022 20:51:43 -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:28.022 20:51:43 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:28.022 20:51:43 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:28.022 20:51:43 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:28.022 20:51:43 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:28.022 20:51:43 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:28.022 20:51:43 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:28.022 20:51:43 -- setup/hugepages.sh@74 -- # (( 2 > 0 )) 00:04:28.022 20:51:43 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:28.022 20:51:43 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:04:28.022 20:51:43 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:28.022 20:51:43 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024 00:04:28.022 20:51:43 -- setup/hugepages.sh@78 -- # return 0 00:04:28.022 20:51:43 -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:04:28.022 20:51:43 -- setup/hugepages.sh@187 -- # setup output 00:04:28.022 20:51:43 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:28.022 20:51:43 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:31.312 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:31.312 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:31.312 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:31.312 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:31.312 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:31.312 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:31.312 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:31.312 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:31.312 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:31.312 0000:80:04.6 
(8086 2021): Already using the vfio-pci driver 00:04:31.312 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:31.312 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:31.312 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:31.312 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:31.312 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:31.312 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:31.312 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:31.312 20:51:46 -- setup/hugepages.sh@188 -- # nr_hugepages=1536 00:04:31.312 20:51:46 -- setup/hugepages.sh@188 -- # verify_nr_hugepages 00:04:31.312 20:51:46 -- setup/hugepages.sh@89 -- # local node 00:04:31.312 20:51:46 -- setup/hugepages.sh@90 -- # local sorted_t 00:04:31.312 20:51:46 -- setup/hugepages.sh@91 -- # local sorted_s 00:04:31.312 20:51:46 -- setup/hugepages.sh@92 -- # local surp 00:04:31.312 20:51:46 -- setup/hugepages.sh@93 -- # local resv 00:04:31.312 20:51:46 -- setup/hugepages.sh@94 -- # local anon 00:04:31.312 20:51:46 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:31.313 20:51:46 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:31.313 20:51:46 -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:31.313 20:51:46 -- setup/common.sh@18 -- # local node= 00:04:31.313 20:51:46 -- setup/common.sh@19 -- # local var val 00:04:31.313 20:51:46 -- setup/common.sh@20 -- # local mem_f mem 00:04:31.313 20:51:46 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:31.313 20:51:46 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:31.313 20:51:46 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:31.313 20:51:46 -- setup/common.sh@28 -- # mapfile -t mem 00:04:31.313 20:51:46 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.313 20:51:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 40472944 kB' 'MemAvailable: 44081332 kB' 'Buffers: 10416 kB' 'Cached: 12756292 kB' 'SwapCached: 0 kB' 'Active: 9967072 kB' 'Inactive: 3333564 kB' 'Active(anon): 9301980 kB' 'Inactive(anon): 17936 kB' 'Active(file): 665092 kB' 'Inactive(file): 3315628 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 536672 kB' 'Mapped: 184372 kB' 'Shmem: 8785988 kB' 'KReclaimable: 249616 kB' 'Slab: 766388 kB' 'SReclaimable: 249616 kB' 'SUnreclaim: 516772 kB' 'KernelStack: 22032 kB' 'PageTables: 8168 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963348 kB' 'Committed_AS: 10609472 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213688 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 488820 kB' 'DirectMap2M: 9682944 kB' 'DirectMap1G: 58720256 kB' 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.313 20:51:46 -- 
setup/common.sh@31 -- # IFS=': ' 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.313 
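Back at the custom_alloc prologue (@174/@177), get_test_nr_hugepages is plain division of the requested size by the default hugepage size, which the record above pins at 'Hugepagesize: 2048 kB'; the two results are then fixed per node through HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024', consistent with the 'HugePages_Total: 1536' in the same record. Spelled out (all sizes in kB):

# The arithmetic behind nr_hugepages=512 and nr_hugepages=1024 above:
default_hugepages=2048
echo $(( 1048576 / default_hugepages ))   # 512  -> nodes_hp[0]
echo $(( 2097152 / default_hugepages ))   # 1024 -> nodes_hp[1]
echo $(( 512 + 1024 ))                    # 1536 == HugePages_Total in the record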
20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # [[ SUnreclaim 
== \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.313 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.313 20:51:46 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # 
read -r var val _ 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.314 20:51:46 -- setup/common.sh@33 -- # echo 0 00:04:31.314 20:51:46 -- setup/common.sh@33 -- # return 0 00:04:31.314 20:51:46 -- setup/hugepages.sh@97 -- # anon=0 00:04:31.314 20:51:46 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:31.314 20:51:46 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:31.314 20:51:46 -- setup/common.sh@18 -- # local node= 00:04:31.314 20:51:46 -- setup/common.sh@19 -- # local var val 00:04:31.314 20:51:46 -- setup/common.sh@20 -- # local mem_f mem 00:04:31.314 20:51:46 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:31.314 20:51:46 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:31.314 20:51:46 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:31.314 20:51:46 -- setup/common.sh@28 -- # mapfile -t mem 00:04:31.314 20:51:46 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.314 20:51:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 40474044 kB' 'MemAvailable: 44082432 kB' 'Buffers: 10416 kB' 'Cached: 12756292 kB' 'SwapCached: 0 kB' 'Active: 9963856 kB' 'Inactive: 3333564 kB' 'Active(anon): 9298764 kB' 'Inactive(anon): 17936 kB' 'Active(file): 665092 kB' 'Inactive(file): 3315628 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 533940 kB' 'Mapped: 183796 kB' 'Shmem: 8785988 kB' 'KReclaimable: 249616 kB' 'Slab: 766388 kB' 'SReclaimable: 249616 kB' 'SUnreclaim: 516772 kB' 'KernelStack: 22016 kB' 'PageTables: 8112 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963348 kB' 'Committed_AS: 10606412 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213704 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 488820 kB' 'DirectMap2M: 9682944 kB' 'DirectMap1G: 58720256 kB' 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.314 20:51:46 -- 
setup/common.sh@31 -- # read -r var val _ 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # 
continue 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.314 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.314 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # [[ PageTables == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.315 
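The AnonHugePages pass that finished above (anon=0) is gated by hugepages.sh@96: anonymous huge pages only count when transparent_hugepage is not pinned to [never], and the odd-looking `[[ always [madvise] never != *\[\n\e\v\e\r\]* ]]` entry is xtrace printing the sysfs file's contents on the left of the test. A sketch of that gate, assuming the standard sysfs path:

# Sketch of the THP gate at hugepages.sh@96:
thp=$(</sys/kernel/mm/transparent_hugepage/enabled)   # e.g. "always [madvise] never"
if [[ $thp != *"[never]"* ]]; then
    # THP available: count anonymous huge pages (0 kB on this box)
    anon=$(awk '/^AnonHugePages:/ {print $2}' /proc/meminfo)
else
    anon=0
fi
echo "anon=${anon:-0}"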
20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.315 20:51:46 -- setup/common.sh@33 -- # echo 0 00:04:31.315 20:51:46 -- setup/common.sh@33 -- # return 0 00:04:31.315 20:51:46 -- setup/hugepages.sh@99 -- # surp=0 00:04:31.315 20:51:46 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:31.315 20:51:46 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:31.315 20:51:46 -- setup/common.sh@18 -- # local node= 00:04:31.315 20:51:46 -- setup/common.sh@19 -- # local var val 00:04:31.315 20:51:46 -- setup/common.sh@20 -- # local mem_f mem 00:04:31.315 20:51:46 -- setup/common.sh@22 -- # 
mem_f=/proc/meminfo 00:04:31.315 20:51:46 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:31.315 20:51:46 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:31.315 20:51:46 -- setup/common.sh@28 -- # mapfile -t mem 00:04:31.315 20:51:46 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.315 20:51:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 40474444 kB' 'MemAvailable: 44082832 kB' 'Buffers: 10416 kB' 'Cached: 12756304 kB' 'SwapCached: 0 kB' 'Active: 9963780 kB' 'Inactive: 3333564 kB' 'Active(anon): 9298688 kB' 'Inactive(anon): 17936 kB' 'Active(file): 665092 kB' 'Inactive(file): 3315628 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 533848 kB' 'Mapped: 183448 kB' 'Shmem: 8786000 kB' 'KReclaimable: 249616 kB' 'Slab: 766388 kB' 'SReclaimable: 249616 kB' 'SUnreclaim: 516772 kB' 'KernelStack: 22016 kB' 'PageTables: 8092 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963348 kB' 'Committed_AS: 10606424 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213704 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 488820 kB' 'DirectMap2M: 9682944 kB' 'DirectMap1G: 58720256 kB' 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.315 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.315 20:51:46 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
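For context: the long runs of [[ Key == \H\u\g\e... ]] / continue lines in this trace are the suite's get_meminfo helper in setup/common.sh scanning the meminfo snapshot it just captured, one key at a time, until the requested key matches. A minimal standalone sketch of that pattern, assuming plain bash; the name get_meminfo_sketch is a stand-in for illustration, not the suite's actual helper:

    get_meminfo_sketch() {
        local get=$1 var val _ line
        mapfile -t mem < /proc/meminfo            # one "Key: value kB" record per element
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] || continue      # the repeated [[ ... ]] / continue lines in the trace
            echo "$val"                           # e.g. get_meminfo_sketch HugePages_Surp -> 0
            return 0
        done
        return 1
    }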
00:04:31.315 20:51:46 -- setup/common.sh@32 -- # continue
[xtrace condensed: identical per-key iterations for Inactive through FileHugePages elided]
00:04:31.316 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.316 20:51:46 -- setup/common.sh@32 -- # [[
FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.316 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.316 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.316 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.316 20:51:46 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.316 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.316 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.316 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.316 20:51:46 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.316 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.316 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.316 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.316 20:51:46 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.317 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.317 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.317 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.317 20:51:46 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.317 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.317 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.317 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.317 20:51:46 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.317 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.317 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.317 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.317 20:51:46 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.317 20:51:46 -- setup/common.sh@33 -- # echo 0 00:04:31.317 20:51:46 -- setup/common.sh@33 -- # return 0 00:04:31.317 20:51:46 -- setup/hugepages.sh@100 -- # resv=0 00:04:31.317 20:51:46 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536 00:04:31.317 nr_hugepages=1536 00:04:31.317 20:51:46 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:31.317 resv_hugepages=0 00:04:31.317 20:51:46 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:31.317 surplus_hugepages=0 00:04:31.317 20:51:46 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:31.317 anon_hugepages=0 00:04:31.317 20:51:46 -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv )) 00:04:31.317 20:51:46 -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages )) 00:04:31.317 20:51:46 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:31.317 20:51:46 -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:31.317 20:51:46 -- setup/common.sh@18 -- # local node= 00:04:31.317 20:51:46 -- setup/common.sh@19 -- # local var val 00:04:31.317 20:51:46 -- setup/common.sh@20 -- # local mem_f mem 00:04:31.317 20:51:46 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:31.317 20:51:46 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:31.317 20:51:46 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:31.317 20:51:46 -- setup/common.sh@28 -- # mapfile -t mem 00:04:31.317 20:51:46 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:31.317 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.317 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.317 20:51:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 40474444 kB' 'MemAvailable: 44082832 kB' 'Buffers: 10416 kB' 'Cached: 12756320 kB' 'SwapCached: 
0 kB' 'Active: 9964148 kB' 'Inactive: 3333564 kB' 'Active(anon): 9299056 kB' 'Inactive(anon): 17936 kB' 'Active(file): 665092 kB' 'Inactive(file): 3315628 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 534192 kB' 'Mapped: 183448 kB' 'Shmem: 8786016 kB' 'KReclaimable: 249616 kB' 'Slab: 766388 kB' 'SReclaimable: 249616 kB' 'SUnreclaim: 516772 kB' 'KernelStack: 22032 kB' 'PageTables: 8144 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963348 kB' 'Committed_AS: 10606440 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213704 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 488820 kB' 'DirectMap2M: 9682944 kB' 'DirectMap1G: 58720256 kB' 00:04:31.317 20:51:46 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.317 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.317 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.317 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.317 20:51:46 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.317 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.317 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.317 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.317 20:51:46 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.317 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.317 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.317 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.317 20:51:46 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.317 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.317 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.317 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.317 20:51:46 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.317 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.317 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.317 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.317 20:51:46 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.317 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.317 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.317 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.317 20:51:46 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.317 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.317 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.317 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.317 20:51:46 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.317 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.317 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.317 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.317 20:51:46 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.317 20:51:46 -- setup/common.sh@32 -- # continue 
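For context: the snapshot printf'd above already carries the value this scan is walking toward, and its hugepage fields are internally consistent: HugePages_Total: 1536 pages at Hugepagesize: 2048 kB each. A quick sanity check, illustrative shell arithmetic only:

    echo $(( 1536 * 2048 ))   # -> 3145728, matching 'Hugetlb: 3145728 kB' in the snapshot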
00:04:31.317 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.317 20:51:46 -- setup/common.sh@31 -- # read -r var val _
[xtrace condensed: identical per-key iterations for Inactive(anon) through CmaTotal elided]
00:04:31.318 20:51:46 --
setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.318 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.318 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.318 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.318 20:51:46 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.318 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.318 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.318 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.318 20:51:46 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.318 20:51:46 -- setup/common.sh@33 -- # echo 1536 00:04:31.318 20:51:46 -- setup/common.sh@33 -- # return 0 00:04:31.318 20:51:46 -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv )) 00:04:31.318 20:51:46 -- setup/hugepages.sh@112 -- # get_nodes 00:04:31.318 20:51:46 -- setup/hugepages.sh@27 -- # local node 00:04:31.318 20:51:46 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:31.318 20:51:46 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:31.318 20:51:46 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:31.318 20:51:46 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:31.318 20:51:46 -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:31.318 20:51:46 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:31.318 20:51:46 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:31.318 20:51:46 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:31.318 20:51:46 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:31.318 20:51:46 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:31.318 20:51:46 -- setup/common.sh@18 -- # local node=0 00:04:31.318 20:51:46 -- setup/common.sh@19 -- # local var val 00:04:31.318 20:51:46 -- setup/common.sh@20 -- # local mem_f mem 00:04:31.318 20:51:46 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:31.318 20:51:46 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:31.318 20:51:46 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:31.318 20:51:46 -- setup/common.sh@28 -- # mapfile -t mem 00:04:31.318 20:51:46 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:31.318 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.318 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.318 20:51:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32639140 kB' 'MemFree: 24965956 kB' 'MemUsed: 7673184 kB' 'SwapCached: 0 kB' 'Active: 4604740 kB' 'Inactive: 102364 kB' 'Active(anon): 4239244 kB' 'Inactive(anon): 0 kB' 'Active(file): 365496 kB' 'Inactive(file): 102364 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4405528 kB' 'Mapped: 134128 kB' 'AnonPages: 304696 kB' 'Shmem: 3937668 kB' 'KernelStack: 12632 kB' 'PageTables: 5680 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 124064 kB' 'Slab: 401364 kB' 'SReclaimable: 124064 kB' 'SUnreclaim: 277300 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:31.318 20:51:46 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.318 20:51:46 -- setup/common.sh@32 -- # continue 
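For context: once get_meminfo is called with a node argument, it switches mem_f from /proc/meminfo to the per-node sysfs file, whose records carry a "Node N " prefix that common.sh@29 strips with an extglob expansion before the same key scan runs. A minimal sketch of that step, assuming a NUMA machine where node0 exists:

    shopt -s extglob                                  # required for the +([0-9]) pattern below
    mapfile -t mem < /sys/devices/system/node/node0/meminfo
    mem=("${mem[@]#Node +([0-9]) }")                  # "Node 0 HugePages_Total: 512" -> "HugePages_Total: 512"
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == HugePages_Total ]] && { echo "$val"; break; }   # -> 512 on node 0 in this run
    done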
00:04:31.318 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.318 20:51:46 -- setup/common.sh@31 -- # read -r var val _
[xtrace condensed: identical per-key iterations for MemFree through Slab elided]
00:04:31.319 20:51:46 -- setup/common.sh@31
-- # read -r var val _ 00:04:31.319 20:51:46 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.319 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.319 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.319 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.319 20:51:46 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.319 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.319 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.319 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.319 20:51:46 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.319 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.319 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.319 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.319 20:51:46 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.319 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.319 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.319 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.319 20:51:46 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.319 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.319 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.319 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.319 20:51:46 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.319 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.319 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.319 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.319 20:51:46 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.319 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.319 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.319 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.319 20:51:46 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.319 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.319 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.319 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.319 20:51:46 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.319 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.319 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.319 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.319 20:51:46 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.319 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.319 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.319 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.319 20:51:46 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.319 20:51:46 -- setup/common.sh@33 -- # echo 0 00:04:31.319 20:51:46 -- setup/common.sh@33 -- # return 0 00:04:31.319 20:51:46 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:31.319 20:51:46 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:31.319 20:51:46 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:31.319 20:51:46 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:31.319 20:51:46 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:31.319 20:51:46 -- setup/common.sh@18 -- # local node=1 
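For context: the custom_alloc test split the 1536-page pool unevenly across the two NUMA nodes (nodes_sys[0]=512, nodes_sys[1]=1024 above), so the per-node totals queried here must sum back to the global figure. Illustrative arithmetic only:

    echo $(( 512 + 1024 ))   # node0 + node1 -> 1536, matching HugePages_Total in the global snapshot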
00:04:31.319 20:51:46 -- setup/common.sh@19 -- # local var val 00:04:31.319 20:51:46 -- setup/common.sh@20 -- # local mem_f mem 00:04:31.319 20:51:46 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:31.319 20:51:46 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:31.319 20:51:46 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:31.319 20:51:46 -- setup/common.sh@28 -- # mapfile -t mem 00:04:31.319 20:51:46 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:31.319 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.319 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.319 20:51:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27656080 kB' 'MemFree: 15508992 kB' 'MemUsed: 12147088 kB' 'SwapCached: 0 kB' 'Active: 5359068 kB' 'Inactive: 3231200 kB' 'Active(anon): 5059472 kB' 'Inactive(anon): 17936 kB' 'Active(file): 299596 kB' 'Inactive(file): 3213264 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8361208 kB' 'Mapped: 49320 kB' 'AnonPages: 229156 kB' 'Shmem: 4848348 kB' 'KernelStack: 9384 kB' 'PageTables: 2412 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 125552 kB' 'Slab: 365024 kB' 'SReclaimable: 125552 kB' 'SUnreclaim: 239472 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:31.319 20:51:46 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.319 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.319 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.319 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.320 20:51:46 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.320 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.320 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.320 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.320 20:51:46 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.320 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.320 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.320 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.320 20:51:46 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.320 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.320 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.320 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.320 20:51:46 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.320 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.320 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.320 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.320 20:51:46 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.320 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.320 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.320 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.320 20:51:46 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.320 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.320 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.320 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.320 20:51:46 -- setup/common.sh@32 -- # [[ 
Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.320 20:51:46 -- setup/common.sh@32 -- # continue
[xtrace condensed: identical per-key iterations for Active(file) through ShmemPmdMapped elided]
00:04:31.320 20:51:46 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.320 20:51:46 --
setup/common.sh@32 -- # continue 00:04:31.320 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.320 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.320 20:51:46 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.320 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.320 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.320 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.320 20:51:46 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.320 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.320 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.320 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.320 20:51:46 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.320 20:51:46 -- setup/common.sh@32 -- # continue 00:04:31.320 20:51:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:31.320 20:51:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:31.320 20:51:46 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.320 20:51:46 -- setup/common.sh@33 -- # echo 0 00:04:31.320 20:51:46 -- setup/common.sh@33 -- # return 0 00:04:31.320 20:51:46 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:31.320 20:51:46 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:31.320 20:51:46 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:31.320 20:51:46 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:31.320 20:51:46 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:31.320 node0=512 expecting 512 00:04:31.320 20:51:46 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:31.320 20:51:46 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:31.320 20:51:46 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:31.320 20:51:46 -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024' 00:04:31.320 node1=1024 expecting 1024 00:04:31.320 20:51:46 -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]] 00:04:31.320 00:04:31.320 real 0m3.119s 00:04:31.320 user 0m1.085s 00:04:31.320 sys 0m2.020s 00:04:31.320 20:51:46 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:04:31.320 20:51:46 -- common/autotest_common.sh@10 -- # set +x 00:04:31.320 ************************************ 00:04:31.320 END TEST custom_alloc 00:04:31.320 ************************************ 00:04:31.320 20:51:46 -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc 00:04:31.320 20:51:46 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:31.320 20:51:46 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:31.320 20:51:46 -- common/autotest_common.sh@10 -- # set +x 00:04:31.321 ************************************ 00:04:31.321 START TEST no_shrink_alloc 00:04:31.321 ************************************ 00:04:31.321 20:51:46 -- common/autotest_common.sh@1111 -- # no_shrink_alloc 00:04:31.321 20:51:46 -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0 00:04:31.321 20:51:46 -- setup/hugepages.sh@49 -- # local size=2097152 00:04:31.321 20:51:46 -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:04:31.321 20:51:46 -- setup/hugepages.sh@51 -- # shift 00:04:31.321 20:51:46 -- setup/hugepages.sh@52 -- # node_ids=('0') 00:04:31.321 20:51:46 -- setup/hugepages.sh@52 -- # local node_ids 00:04:31.321 20:51:46 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:31.321 20:51:46 
-- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:31.321 20:51:46 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:04:31.321 20:51:46 -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:04:31.321 20:51:46 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:31.321 20:51:46 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:31.321 20:51:46 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:31.321 20:51:46 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:31.321 20:51:46 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:31.321 20:51:46 -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:04:31.321 20:51:46 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:31.321 20:51:46 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:04:31.321 20:51:46 -- setup/hugepages.sh@73 -- # return 0 00:04:31.321 20:51:46 -- setup/hugepages.sh@198 -- # setup output 00:04:31.321 20:51:46 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:31.321 20:51:46 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:34.613 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:34.613 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:34.613 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:34.613 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:34.613 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:34.613 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:34.613 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:34.613 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:34.613 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:34.613 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:34.613 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:34.613 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:34.613 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:34.613 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:34.613 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:34.613 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:34.613 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:34.613 20:51:50 -- setup/hugepages.sh@199 -- # verify_nr_hugepages 00:04:34.613 20:51:50 -- setup/hugepages.sh@89 -- # local node 00:04:34.613 20:51:50 -- setup/hugepages.sh@90 -- # local sorted_t 00:04:34.613 20:51:50 -- setup/hugepages.sh@91 -- # local sorted_s 00:04:34.613 20:51:50 -- setup/hugepages.sh@92 -- # local surp 00:04:34.613 20:51:50 -- setup/hugepages.sh@93 -- # local resv 00:04:34.613 20:51:50 -- setup/hugepages.sh@94 -- # local anon 00:04:34.613 20:51:50 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:34.613 20:51:50 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:34.613 20:51:50 -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:34.613 20:51:50 -- setup/common.sh@18 -- # local node= 00:04:34.613 20:51:50 -- setup/common.sh@19 -- # local var val 00:04:34.613 20:51:50 -- setup/common.sh@20 -- # local mem_f mem 00:04:34.613 20:51:50 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:34.613 20:51:50 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:34.613 20:51:50 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:34.613 20:51:50 -- setup/common.sh@28 -- # mapfile -t mem 00:04:34.613 
20:51:50 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:34.613 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.613 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.613 20:51:50 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41547792 kB' 'MemAvailable: 45156184 kB' 'Buffers: 10416 kB' 'Cached: 12756432 kB' 'SwapCached: 0 kB' 'Active: 9967464 kB' 'Inactive: 3333564 kB' 'Active(anon): 9302372 kB' 'Inactive(anon): 17936 kB' 'Active(file): 665092 kB' 'Inactive(file): 3315628 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 537880 kB' 'Mapped: 183584 kB' 'Shmem: 8786128 kB' 'KReclaimable: 249624 kB' 'Slab: 765912 kB' 'SReclaimable: 249624 kB' 'SUnreclaim: 516288 kB' 'KernelStack: 22048 kB' 'PageTables: 8212 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 10607192 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213672 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 488820 kB' 'DirectMap2M: 9682944 kB' 'DirectMap1G: 58720256 kB' 00:04:34.613 20:51:50 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.613 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.613 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.613 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.613 20:51:50 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.613 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.613 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.613 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.613 20:51:50 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.613 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.613 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.613 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.613 20:51:50 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.613 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.613 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.613 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.613 20:51:50 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.613 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.613 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.613 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.613 20:51:50 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.613 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.613 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.613 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.613 20:51:50 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.613 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.613 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.613 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.613 20:51:50 -- setup/common.sh@32 -- # [[ Inactive == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.613 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.613 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.613 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.613 20:51:50 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.613 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.613 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.613 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.613 20:51:50 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 
00:04:34.614 20:51:50 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 
00:04:34.614 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:34.614 20:51:50 -- setup/common.sh@33 -- # echo 0 00:04:34.614 20:51:50 -- setup/common.sh@33 -- # return 0 00:04:34.614 20:51:50 -- setup/hugepages.sh@97 -- # anon=0 00:04:34.614 20:51:50 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:34.614 20:51:50 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:34.614 20:51:50 -- setup/common.sh@18 -- # local node= 00:04:34.614 20:51:50 -- setup/common.sh@19 -- # local var val 00:04:34.614 20:51:50 -- setup/common.sh@20 -- # local mem_f mem 00:04:34.614 20:51:50 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:34.614 20:51:50 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:34.614 20:51:50 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:34.614 20:51:50 -- setup/common.sh@28 -- # mapfile -t mem 00:04:34.614 20:51:50 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.614 20:51:50 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41547896 kB' 'MemAvailable: 45156288 kB' 'Buffers: 10416 kB' 'Cached: 12756436 kB' 'SwapCached: 0 kB' 'Active: 9966900 kB' 'Inactive: 3333564 kB' 'Active(anon): 9301808 kB' 'Inactive(anon): 17936 kB' 'Active(file): 665092 kB' 'Inactive(file): 3315628 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 
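
The scan above is get_meminfo resolving AnonHugePages: every /proc/meminfo key is tested and skipped with `continue` until the requested field matches, then its value is echoed and the function returns (anon=0 here). A condensed sketch of the same lookup; the script in the trace first snapshots the file with mapfile so one read can serve several fields, which this single-pass version omits:

    # Minimal sketch of the field scan the trace shows: each line is split
    # on ': ', and non-matching keys are skipped until the requested field
    # is found. Sized fields report kB; HugePages_* fields are bare counts.
    get_meminfo() {
        local get=$1 var val _
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] || continue
            echo "$val"
            return 0
        done < /proc/meminfo
        return 1
    }

    anon=$(get_meminfo AnonHugePages)   # 0 on this run

Snapshotting with mapfile, as the real script does, keeps all the fields consistent from one read of /proc/meminfo instead of reopening it per lookup.
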
kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 537272 kB' 'Mapped: 183480 kB' 'Shmem: 8786132 kB' 'KReclaimable: 249624 kB' 'Slab: 765812 kB' 'SReclaimable: 249624 kB' 'SUnreclaim: 516188 kB' 'KernelStack: 22016 kB' 'PageTables: 8092 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 10607204 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213656 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 488820 kB' 'DirectMap2M: 9682944 kB' 'DirectMap1G: 58720256 kB' 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.614 20:51:50 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.614 
20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.614 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.615 20:51:50 -- 
setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # 
IFS=': ' 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # [[ Unaccepted == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.615 20:51:50 -- setup/common.sh@33 -- # echo 0 00:04:34.615 20:51:50 -- setup/common.sh@33 -- # return 0 00:04:34.615 20:51:50 -- setup/hugepages.sh@99 -- # surp=0 00:04:34.615 20:51:50 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:34.615 20:51:50 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:34.615 20:51:50 -- setup/common.sh@18 -- # local node= 00:04:34.615 20:51:50 -- setup/common.sh@19 -- # local var val 00:04:34.615 20:51:50 -- setup/common.sh@20 -- # local mem_f mem 00:04:34.615 20:51:50 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:34.615 20:51:50 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:34.615 20:51:50 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:34.615 20:51:50 -- setup/common.sh@28 -- # mapfile -t mem 00:04:34.615 20:51:50 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:34.615 20:51:50 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41548404 kB' 'MemAvailable: 45156796 kB' 'Buffers: 10416 kB' 'Cached: 12756448 kB' 'SwapCached: 0 kB' 'Active: 9966824 kB' 'Inactive: 3333564 kB' 'Active(anon): 9301732 kB' 'Inactive(anon): 17936 kB' 'Active(file): 665092 kB' 'Inactive(file): 3315628 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 537200 kB' 'Mapped: 183480 kB' 'Shmem: 8786144 kB' 'KReclaimable: 249624 kB' 'Slab: 765812 kB' 'SReclaimable: 249624 kB' 'SUnreclaim: 516188 kB' 'KernelStack: 22016 kB' 'PageTables: 8092 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 10607220 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213656 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 488820 kB' 'DirectMap2M: 9682944 kB' 'DirectMap1G: 58720256 kB' 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.615 20:51:50 -- 
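
get_meminfo also accepts a node id, as the empty `node=` local and the /sys/devices/system/node/node/meminfo existence test above suggest; per-node meminfo lines carry a "Node <N> " prefix that the trace strips with the extglob substitution `${mem[@]#Node +([0-9]) }`. A sketch of that node-aware path, using node 0 as an illustrative id:

    # Sketch of the per-node variant visible in the trace: with a node id,
    # the per-node meminfo file is read and its "Node <N> " line prefix is
    # removed before the same key/value scan runs.
    shopt -s extglob
    node=0                                       # illustrative node id
    mem_f=/sys/devices/system/node/node$node/meminfo
    [[ -e $mem_f ]] || mem_f=/proc/meminfo       # no such node: global view

    mapfile -t mem < "$mem_f"                    # snapshot the file once
    mem=("${mem[@]#Node +([0-9]) }")             # drop the "Node 0 " prefix
    printf '%s\n' "${mem[@]}" | grep '^HugePages_Total'
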
setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.615 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.615 20:51:50 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # 
IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- 
setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.616 
20:51:50 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.616 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.616 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.617 20:51:50 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.617 20:51:50 -- setup/common.sh@33 -- # echo 0 00:04:34.617 
20:51:50 -- setup/common.sh@33 -- # return 0 00:04:34.617 20:51:50 -- setup/hugepages.sh@100 -- # resv=0 00:04:34.617 20:51:50 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:34.617 nr_hugepages=1024 00:04:34.617 20:51:50 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:34.617 resv_hugepages=0 00:04:34.617 20:51:50 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:34.617 surplus_hugepages=0 00:04:34.617 20:51:50 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:34.617 anon_hugepages=0 00:04:34.617 20:51:50 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:34.617 20:51:50 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:34.617 20:51:50 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:34.617 20:51:50 -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:34.617 20:51:50 -- setup/common.sh@18 -- # local node= 00:04:34.617 20:51:50 -- setup/common.sh@19 -- # local var val 00:04:34.617 20:51:50 -- setup/common.sh@20 -- # local mem_f mem 00:04:34.617 20:51:50 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:34.617 20:51:50 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:34.617 20:51:50 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:34.617 20:51:50 -- setup/common.sh@28 -- # mapfile -t mem 00:04:34.617 20:51:50 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:34.617 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.617 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.617 20:51:50 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41548404 kB' 'MemAvailable: 45156796 kB' 'Buffers: 10416 kB' 'Cached: 12756460 kB' 'SwapCached: 0 kB' 'Active: 9966768 kB' 'Inactive: 3333564 kB' 'Active(anon): 9301676 kB' 'Inactive(anon): 17936 kB' 'Active(file): 665092 kB' 'Inactive(file): 3315628 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 537116 kB' 'Mapped: 183480 kB' 'Shmem: 8786156 kB' 'KReclaimable: 249624 kB' 'Slab: 765812 kB' 'SReclaimable: 249624 kB' 'SUnreclaim: 516188 kB' 'KernelStack: 22000 kB' 'PageTables: 8040 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 10607232 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213672 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 488820 kB' 'DirectMap2M: 9682944 kB' 'DirectMap1G: 58720256 kB' 00:04:34.617 20:51:50 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.617 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.617 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.617 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.617 20:51:50 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.617 20:51:50 -- setup/common.sh@32 -- # continue 00:04:34.617 20:51:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.617 20:51:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.617 20:51:50 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 
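
With anon, surp, and resv resolved, the trace echoes the bookkeeping summary (nr_hugepages=1024, resv_hugepages=0, surplus_hugepages=0, anon_hugepages=0) and asserts that the configured total equals the requested pages plus surplus and reserved. A sketch of that check, reusing the get_meminfo sketch above; the error messages are illustrative, not from the script:

    # Sketch of the consistency check the trace performs after the
    # individual lookups: HugePages_Total must account for the requested
    # pages plus any surplus and reserved pages.
    nr_hugepages=1024                      # the count the test configured
    surp=$(get_meminfo HugePages_Surp)     # 0 on this run
    resv=$(get_meminfo HugePages_Rsvd)     # 0 on this run
    total=$(get_meminfo HugePages_Total)   # 1024 on this run

    (( total == nr_hugepages + surp + resv )) || echo "hugepage accounting is off"
    (( total == nr_hugepages )) || echo "unexpected surplus or reserved pages"
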
00:04:34.617 20:51:50 -- setup/common.sh@31..32 -- # [trace condensed: the IFS=': ' read/compare loop walks the /proc/meminfo keys (Buffers, Cached, SwapCached, Active/Inactive and their anon/file splits, Unevictable, Mlocked, SwapTotal, SwapFree, Zswap, Zswapped, Dirty, Writeback, AnonPages, Mapped, Shmem, KReclaimable, Slab, SReclaimable, SUnreclaim, KernelStack, PageTables, SecPageTables, NFS_Unstable, Bounce, WritebackTmp, CommitLimit, Committed_AS, Vmalloc*, Percpu, HardwareCorrupted, the *HugePages/*PmdMapped family, CmaTotal, CmaFree, Unaccepted); every key fails the HugePages_Total match and takes the continue branch]
00:04:34.618 20:51:50 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:34.618 20:51:50 -- setup/common.sh@33 -- # echo 1024
00:04:34.618 20:51:50 -- setup/common.sh@33 -- # return 0
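The call that just returned is the whole of setup/common.sh's get_meminfo pattern: snapshot a meminfo file, then linearly scan it for one key. A minimal self-contained sketch of that pattern follows; get_meminfo_sketch is a hypothetical name, the mapfile/extglob/read statements mirror the traced lines, and everything else is an assumption rather than the script's exact code.

    #!/usr/bin/env bash
    shopt -s extglob   # needed for the "Node +([0-9]) " prefix pattern below

    # Sketch of the scan traced above: print the value column for one key.
    # Usage: get_meminfo_sketch HugePages_Total        (system-wide)
    #        get_meminfo_sketch HugePages_Surp 0       (NUMA node 0)
    get_meminfo_sketch() {
        local get=$1 node=${2:-} var val _ mem
        local mem_f=/proc/meminfo
        # Prefer the per-node view when a node is given and the file exists.
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem <"$mem_f"
        # Per-node meminfo lines carry a "Node N " prefix; strip it.
        mem=("${mem[@]#Node +([0-9]) }")
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] || continue   # the long run of "continue" in the log
            echo "$val"
            return 0
        done < <(printf '%s\n' "${mem[@]}")
        return 1   # key not present
    }

On this host the two calls visible in this excerpt would print 1024 (HugePages_Total) and 0 (HugePages_Surp on node 0), matching the echoed values in the trace.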
00:04:34.618 20:51:50 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:34.618 20:51:50 -- setup/hugepages.sh@112 -- # get_nodes
00:04:34.618 20:51:50 -- setup/hugepages.sh@27 -- # local node
00:04:34.618 20:51:50 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:34.618 20:51:50 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:34.618 20:51:50 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:34.618 20:51:50 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:04:34.618 20:51:50 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:34.618 20:51:50 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:34.618 20:51:50 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:34.618 20:51:50 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:34.618 20:51:50 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:34.618 20:51:50 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:34.618 20:51:50 -- setup/common.sh@18 -- # local node=0
00:04:34.618 20:51:50 -- setup/common.sh@19 -- # local var val
00:04:34.618 20:51:50 -- setup/common.sh@20 -- # local mem_f mem
00:04:34.618 20:51:50 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:34.618 20:51:50 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:34.618 20:51:50 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:34.618 20:51:50 -- setup/common.sh@28 -- # mapfile -t mem
00:04:34.618 20:51:50 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:34.618 20:51:50 -- setup/common.sh@31 -- # IFS=': '
00:04:34.618 20:51:50 -- setup/common.sh@31 -- # read -r var val _
00:04:34.618 20:51:50 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32639140 kB' 'MemFree: 23935636 kB' 'MemUsed: 8703504 kB' 'SwapCached: 0 kB' 'Active: 4608792 kB' 'Inactive: 102364 kB' 'Active(anon): 4243296 kB' 'Inactive(anon): 0 kB' 'Active(file): 365496 kB' 'Inactive(file): 102364 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4405664 kB' 'Mapped: 134152 kB' 'AnonPages: 309060 kB' 'Shmem: 3937804 kB' 'KernelStack: 12648 kB' 'PageTables: 5776 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 124072 kB' 'Slab: 401004 kB' 'SReclaimable: 124072 kB' 'SUnreclaim: 276932 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:04:34.618 20:51:50 -- setup/common.sh@31..32 -- # [trace condensed: same read/compare loop over the node0 keys printed above; every key up to HugePages_Free fails the HugePages_Surp match and takes the continue branch]
00:04:34.878 20:51:50 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:34.878 20:51:50 -- setup/common.sh@33 -- # echo 0
00:04:34.878 20:51:50 -- setup/common.sh@33 -- # return 0
00:04:34.878 20:51:50 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:34.878 20:51:50 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:34.878 20:51:50 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:34.878 20:51:50 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:34.878 20:51:50 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
node0=1024 expecting 1024
00:04:34.878 20:51:50 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:04:34.878 20:51:50 -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no
00:04:34.878 20:51:50 -- setup/hugepages.sh@202 -- # NRHUGE=512
00:04:34.878 20:51:50 -- setup/hugepages.sh@202 -- # setup output
00:04:34.878 20:51:50 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:34.878 20:51:50 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:38.173 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:38.173 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:38.173 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:38.173 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:38.173 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:38.173 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:38.173 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:38.173 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:38.173 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:38.173 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:38.173 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:38.173 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:38.173 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:38.173 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:38.173 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:38.173 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:38.173 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:38.173 INFO: Requested 512 hugepages but 1024 already allocated on node0
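The INFO line above is setup.sh declining to shrink an existing allocation: NRHUGE requested 512 pages, but node0 already holds 1024. The per-node counts that the node0/node1 meminfo scans derive are also exposed directly in sysfs, so a quick cross-check could read them there; a short sketch, assuming the 2048 kB default hugepage size this host reports ('Hugepagesize: 2048 kB' in the dumps below):

    #!/usr/bin/env bash
    # Sketch: read per-node 2 MiB hugepage counts straight from sysfs.
    # These are the same numbers the per-node meminfo scans produce.
    for node in /sys/devices/system/node/node[0-9]*; do
        nr=$(<"$node/hugepages/hugepages-2048kB/nr_hugepages")
        echo "${node##*/}=$nr hugepages"
    done

On this box it would print node0=1024 and node1=0, the split the nodes_sys array recorded above.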
00:04:38.173 20:51:53 -- setup/hugepages.sh@204 -- # verify_nr_hugepages
00:04:38.173 20:51:53 -- setup/hugepages.sh@89 -- # local node
00:04:38.173 20:51:53 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:38.173 20:51:53 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:38.173 20:51:53 -- setup/hugepages.sh@92 -- # local surp
00:04:38.173 20:51:53 -- setup/hugepages.sh@93 -- # local resv
00:04:38.173 20:51:53 -- setup/hugepages.sh@94 -- # local anon
00:04:38.173 20:51:53 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:38.173 20:51:53 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:38.173 20:51:53 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:38.173 20:51:53 -- setup/common.sh@18 -- # local node=
00:04:38.173 20:51:53 -- setup/common.sh@19 -- # local var val
00:04:38.173 20:51:53 -- setup/common.sh@20 -- # local mem_f mem
00:04:38.173 20:51:53 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:38.173 20:51:53 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:38.173 20:51:53 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:38.173 20:51:53 -- setup/common.sh@28 -- # mapfile -t mem
00:04:38.173 20:51:53 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:38.173 20:51:53 -- setup/common.sh@31 -- # IFS=': '
00:04:38.173 20:51:53 -- setup/common.sh@31 -- # read -r var val _
00:04:38.173 20:51:53 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41542556 kB' 'MemAvailable: 45150940 kB' 'Buffers: 10416 kB' 'Cached: 12756544 kB' 'SwapCached: 0 kB' 'Active: 9966788 kB' 'Inactive: 3333564 kB' 'Active(anon): 9301696 kB' 'Inactive(anon): 17936 kB' 'Active(file): 665092 kB' 'Inactive(file): 3315628 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 536676 kB' 'Mapped: 183500 kB' 'Shmem: 8786240 kB' 'KReclaimable: 249608 kB' 'Slab: 765776 kB' 'SReclaimable: 249608 kB' 'SUnreclaim: 516168 kB' 'KernelStack: 22224 kB' 'PageTables: 8960 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 10610480 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213816 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 488820 kB' 'DirectMap2M: 9682944 kB' 'DirectMap1G: 58720256 kB'
00:04:38.174 20:51:53 -- setup/common.sh@31..32 -- # [trace condensed: read/compare loop over the keys printed above; every key before AnonHugePages fails the match and takes the continue branch]
00:04:38.174 20:51:53 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:38.174 20:51:53 -- setup/common.sh@33 -- # echo 0
00:04:38.174 20:51:53 -- setup/common.sh@33 -- # return 0
00:04:38.174 20:51:53 -- setup/hugepages.sh@97 -- # anon=0
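verify_nr_hugepages opens with a transparent-hugepage gate: the @96 test above asks whether the bracketed active mode in the THP sysfs string is [never]. On this host the string is "always [madvise] never", so the test succeeds and AnonHugePages gets sampled (0 kB here). A sketch of that gate, assuming the standard sysfs location for the mode string:

    #!/usr/bin/env bash
    # Sketch of the gate at setup/hugepages.sh@96: the token in brackets
    # marks the active THP mode, e.g. "always [madvise] never".
    thp=$(</sys/kernel/mm/transparent_hugepage/enabled)
    if [[ $thp != *"[never]"* ]]; then
        # THP is not disabled, so anonymous huge pages may exist and must
        # be sampled before judging the explicit HugePages_* counters.
        echo "THP active: $thp"
    fi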
00:04:38.174 20:51:53 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:38.174 20:51:53 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:38.174 20:51:53 -- setup/common.sh@18 -- # local node=
00:04:38.174 20:51:53 -- setup/common.sh@19 -- # local var val
00:04:38.174 20:51:53 -- setup/common.sh@20 -- # local mem_f mem
00:04:38.174 20:51:53 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:38.174 20:51:53 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:38.174 20:51:53 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:38.174 20:51:53 -- setup/common.sh@28 -- # mapfile -t mem
00:04:38.175 20:51:53 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:38.175 20:51:53 -- setup/common.sh@31 -- # IFS=': '
00:04:38.175 20:51:53 -- setup/common.sh@31 -- # read -r var val _
00:04:38.175 20:51:53 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41542052 kB' 'MemAvailable: 45150436 kB' 'Buffers: 10416 kB' 'Cached: 12756548 kB' 'SwapCached: 0 kB' 'Active: 9966108 kB' 'Inactive: 3333564 kB' 'Active(anon): 9301016 kB' 'Inactive(anon): 17936 kB' 'Active(file): 665092 kB' 'Inactive(file): 3315628 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 535960 kB' 'Mapped: 183496 kB' 'Shmem: 8786244 kB' 'KReclaimable: 249608 kB' 'Slab: 765804 kB' 'SReclaimable: 249608 kB' 'SUnreclaim: 516196 kB' 'KernelStack: 22080 kB' 'PageTables: 8180 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 10610492 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213736 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 488820 kB' 'DirectMap2M: 9682944 kB' 'DirectMap1G: 58720256 kB'
00:04:38.175 20:51:53 -- setup/common.sh@31..32 -- # [trace condensed: read/compare loop over the keys printed above; everything up to HugePages_Free fails the HugePages_Surp match and takes the continue branch]
00:04:38.176 20:51:53 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:38.176 20:51:53 -- setup/common.sh@33 -- # echo 0
00:04:38.176 20:51:53 -- setup/common.sh@33 -- # return 0
00:04:38.176 20:51:53 -- setup/hugepages.sh@99 -- # surp=0
00:04:38.176 20:51:53 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:38.176 20:51:53 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:38.176 20:51:53 -- setup/common.sh@18 -- # local node=
00:04:38.176 20:51:53 -- setup/common.sh@19 -- # local var val
00:04:38.176 20:51:53 -- setup/common.sh@20 -- # local mem_f mem
00:04:38.176 20:51:53 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:38.176 20:51:53 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:38.176 20:51:53 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:38.176 20:51:53 -- setup/common.sh@28 -- # mapfile -t mem
00:04:38.176 20:51:53 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:38.176 20:51:53 -- setup/common.sh@31 -- # IFS=': '
00:04:38.176 20:51:53 -- setup/common.sh@31 -- # read -r var val _
00:04:38.176 20:51:53 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41540724 kB' 'MemAvailable: 45149108 kB' 'Buffers: 10416 kB' 'Cached: 12756560 kB' 'SwapCached: 0 kB' 'Active: 9966728 kB' 'Inactive: 3333564 kB' 'Active(anon): 9301636 kB' 'Inactive(anon): 17936 kB' 'Active(file): 665092 kB' 'Inactive(file): 3315628 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 536564 kB' 'Mapped: 183496 kB' 'Shmem: 8786256 kB' 'KReclaimable: 249608 kB' 'Slab: 765804 kB' 'SReclaimable: 249608 kB' 'SUnreclaim: 516196 kB' 'KernelStack: 22288 kB' 'PageTables: 9120 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 10610508 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213800 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 488820 kB' 'DirectMap2M: 9682944 kB' 'DirectMap1G: 58720256 kB'
00:04:38.177 20:51:53 -- setup/common.sh@31..32 -- # [trace condensed: read/compare loop over the keys printed above, still short of HugePages_Rsvd when this log excerpt ends]
_ 00:04:38.177 20:51:53 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.177 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.177 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.177 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.177 20:51:53 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.177 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.177 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.177 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.177 20:51:53 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.177 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.177 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.177 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.177 20:51:53 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.177 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.177 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.177 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.177 20:51:53 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.177 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.177 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.177 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.177 20:51:53 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.177 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.177 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.177 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.177 20:51:53 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.177 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.177 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.177 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.177 20:51:53 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.177 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.177 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.177 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.177 20:51:53 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.177 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.177 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.177 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.177 20:51:53 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.177 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.177 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.177 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.177 20:51:53 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.177 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.177 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.177 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.177 20:51:53 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.177 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.177 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.177 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.177 20:51:53 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.177 20:51:53 -- setup/common.sh@32 -- # continue 
00:04:38.177 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.177 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.177 20:51:53 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.177 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.177 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.177 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.177 20:51:53 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.177 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.177 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.177 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.177 20:51:53 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.177 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.177 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.177 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.177 20:51:53 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.177 20:51:53 -- setup/common.sh@33 -- # echo 0 00:04:38.177 20:51:53 -- setup/common.sh@33 -- # return 0 00:04:38.177 20:51:53 -- setup/hugepages.sh@100 -- # resv=0 00:04:38.177 20:51:53 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:38.177 nr_hugepages=1024 00:04:38.177 20:51:53 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:38.177 resv_hugepages=0 00:04:38.177 20:51:53 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:38.177 surplus_hugepages=0 00:04:38.177 20:51:53 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:38.177 anon_hugepages=0 00:04:38.177 20:51:53 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:38.177 20:51:53 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:38.177 20:51:53 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:38.177 20:51:53 -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:38.177 20:51:53 -- setup/common.sh@18 -- # local node= 00:04:38.177 20:51:53 -- setup/common.sh@19 -- # local var val 00:04:38.177 20:51:53 -- setup/common.sh@20 -- # local mem_f mem 00:04:38.177 20:51:53 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:38.177 20:51:53 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:38.177 20:51:53 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:38.177 20:51:53 -- setup/common.sh@28 -- # mapfile -t mem 00:04:38.177 20:51:53 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:38.177 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.177 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.178 20:51:53 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295220 kB' 'MemFree: 41543400 kB' 'MemAvailable: 45151784 kB' 'Buffers: 10416 kB' 'Cached: 12756572 kB' 'SwapCached: 0 kB' 'Active: 9966108 kB' 'Inactive: 3333564 kB' 'Active(anon): 9301016 kB' 'Inactive(anon): 17936 kB' 'Active(file): 665092 kB' 'Inactive(file): 3315628 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 535932 kB' 'Mapped: 183496 kB' 'Shmem: 8786268 kB' 'KReclaimable: 249608 kB' 'Slab: 765804 kB' 'SReclaimable: 249608 kB' 'SUnreclaim: 516196 kB' 'KernelStack: 22064 kB' 'PageTables: 8064 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487636 kB' 'Committed_AS: 10609008 kB' 'VmallocTotal: 
34359738367 kB' 'VmallocUsed: 213640 kB' 'VmallocChunk: 0 kB' 'Percpu: 73024 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 488820 kB' 'DirectMap2M: 9682944 kB' 'DirectMap1G: 58720256 kB' 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.178 20:51:53 -- 
setup/common.sh@31 -- # read -r var val _ 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.178 20:51:53 -- 
setup/common.sh@32 -- # continue 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.178 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.178 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # read -r var 
val _ 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.179 20:51:53 -- 
setup/common.sh@33 -- # echo 1024 00:04:38.179 20:51:53 -- setup/common.sh@33 -- # return 0 00:04:38.179 20:51:53 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:38.179 20:51:53 -- setup/hugepages.sh@112 -- # get_nodes 00:04:38.179 20:51:53 -- setup/hugepages.sh@27 -- # local node 00:04:38.179 20:51:53 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:38.179 20:51:53 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:38.179 20:51:53 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:38.179 20:51:53 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:38.179 20:51:53 -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:38.179 20:51:53 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:38.179 20:51:53 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:38.179 20:51:53 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:38.179 20:51:53 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:38.179 20:51:53 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:38.179 20:51:53 -- setup/common.sh@18 -- # local node=0 00:04:38.179 20:51:53 -- setup/common.sh@19 -- # local var val 00:04:38.179 20:51:53 -- setup/common.sh@20 -- # local mem_f mem 00:04:38.179 20:51:53 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:38.179 20:51:53 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:38.179 20:51:53 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:38.179 20:51:53 -- setup/common.sh@28 -- # mapfile -t mem 00:04:38.179 20:51:53 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.179 20:51:53 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32639140 kB' 'MemFree: 23929112 kB' 'MemUsed: 8710028 kB' 'SwapCached: 0 kB' 'Active: 4607300 kB' 'Inactive: 102364 kB' 'Active(anon): 4241804 kB' 'Inactive(anon): 0 kB' 'Active(file): 365496 kB' 'Inactive(file): 102364 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4405728 kB' 'Mapped: 134156 kB' 'AnonPages: 307036 kB' 'Shmem: 3937868 kB' 'KernelStack: 12824 kB' 'PageTables: 6336 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 124056 kB' 'Slab: 400868 kB' 'SReclaimable: 124056 kB' 'SUnreclaim: 276812 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # read 
-r var val _ 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.179 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.179 20:51:53 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.180 
20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # [[ AnonHugePages == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # continue 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # IFS=': ' 00:04:38.180 20:51:53 -- setup/common.sh@31 -- # read -r var val _ 00:04:38.180 20:51:53 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.180 20:51:53 -- setup/common.sh@33 -- # echo 0 00:04:38.180 20:51:53 -- setup/common.sh@33 -- # return 0 00:04:38.180 20:51:53 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:38.180 20:51:53 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:38.180 20:51:53 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:38.180 20:51:53 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:38.180 20:51:53 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:38.180 node0=1024 expecting 1024 00:04:38.180 20:51:53 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:38.180 00:04:38.180 real 0m7.034s 00:04:38.180 user 0m2.621s 00:04:38.180 sys 0m4.506s 00:04:38.180 20:51:53 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:04:38.180 20:51:53 -- common/autotest_common.sh@10 -- # set +x 00:04:38.180 ************************************ 00:04:38.180 END TEST no_shrink_alloc 00:04:38.180 ************************************ 00:04:38.440 20:51:53 -- setup/hugepages.sh@217 -- # clear_hp 00:04:38.440 20:51:53 -- setup/hugepages.sh@37 -- # local node hp 00:04:38.440 20:51:53 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:38.440 
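The trace above is dominated by setup/common.sh's get_meminfo helper: it reads /proc/meminfo (or a node's sysfs meminfo), splits each line with IFS=': ', skips every key that is not the requested one, and echoes the matching value. A minimal standalone sketch of that pattern -- names here are illustrative, not the script's own:

    #!/usr/bin/env bash
    shopt -s extglob
    # Sketch of the get_meminfo pattern seen in the trace: scan a meminfo
    # file key by key and print the value of one requested key.
    get_meminfo_sketch() {
        local get=$1 node=${2:-} line var val _
        local mem_f=/proc/meminfo
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        while read -r line; do
            line=${line#Node +([0-9]) }    # per-node files prefix each key with "Node N "
            IFS=': ' read -r var val _ <<< "$line"
            # The long runs of 'continue' in the trace are this non-match branch.
            [[ $var == "$get" ]] || continue
            echo "$val"
            return 0
        done < "$mem_f"
        return 1
    }
    get_meminfo_sketch HugePages_Rsvd      # prints 0 on the machine traced above
    get_meminfo_sketch HugePages_Surp 0    # node0 value, also 0 here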
00:04:38.440 20:51:53 -- setup/hugepages.sh@217 -- # clear_hp 00:04:38.440 20:51:53 -- setup/hugepages.sh@37 -- # local node hp 00:04:38.440 20:51:53 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:38.440 20:51:53 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:38.440 20:51:53 -- setup/hugepages.sh@41 -- # echo 0 00:04:38.440 20:51:53 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:38.440 20:51:53 -- setup/hugepages.sh@41 -- # echo 0 00:04:38.440 20:51:53 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:38.440 20:51:53 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:38.440 20:51:53 -- setup/hugepages.sh@41 -- # echo 0 00:04:38.440 20:51:53 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:38.440 20:51:53 -- setup/hugepages.sh@41 -- # echo 0 00:04:38.440 20:51:53 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:04:38.440 20:51:53 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes
00:04:38.440
00:04:38.440 real 0m26.928s
00:04:38.440 user 0m9.478s
00:04:38.440 sys 0m16.085s
00:04:38.440 20:51:53 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:04:38.440 20:51:53 -- common/autotest_common.sh@10 -- # set +x
00:04:38.440 ************************************
00:04:38.440 END TEST hugepages
00:04:38.440 ************************************
00:04:38.440 20:51:53 -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:04:38.440 20:51:53 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:38.440 20:51:53 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:38.440 20:51:53 -- common/autotest_common.sh@10 -- # set +x
00:04:38.440 ************************************
00:04:38.440 START TEST driver
00:04:38.440 ************************************
00:04:38.440 20:51:54 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh
00:04:38.700 * Looking for test storage...
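clear_hp, traced just above, resets every hugepage pool by writing 0 to each nr_hugepages file under the per-node sysfs tree. The same effect as a standalone snippet (standard kernel sysfs paths; needs root):

    # Zero every hugepage pool on every NUMA node, as clear_hp does.
    for node in /sys/devices/system/node/node[0-9]*; do
        for hp in "$node"/hugepages/hugepages-*; do
            echo 0 > "$hp/nr_hugepages"
        done
    done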
00:04:38.700 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:38.700 20:51:54 -- setup/driver.sh@68 -- # setup reset 00:04:38.700 20:51:54 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:38.700 20:51:54 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:43.979 20:51:58 -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:04:43.979 20:51:58 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:43.980 20:51:58 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:43.980 20:51:58 -- common/autotest_common.sh@10 -- # set +x 00:04:43.980 ************************************ 00:04:43.980 START TEST guess_driver 00:04:43.980 ************************************ 00:04:43.980 20:51:58 -- common/autotest_common.sh@1111 -- # guess_driver 00:04:43.980 20:51:58 -- setup/driver.sh@46 -- # local driver setup_driver marker 00:04:43.980 20:51:58 -- setup/driver.sh@47 -- # local fail=0 00:04:43.980 20:51:58 -- setup/driver.sh@49 -- # pick_driver 00:04:43.980 20:51:58 -- setup/driver.sh@36 -- # vfio 00:04:43.980 20:51:58 -- setup/driver.sh@21 -- # local iommu_grups 00:04:43.980 20:51:58 -- setup/driver.sh@22 -- # local unsafe_vfio 00:04:43.980 20:51:58 -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:04:43.980 20:51:58 -- setup/driver.sh@25 -- # unsafe_vfio=N 00:04:43.980 20:51:58 -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:04:43.980 20:51:58 -- setup/driver.sh@29 -- # (( 176 > 0 )) 00:04:43.980 20:51:58 -- setup/driver.sh@30 -- # is_driver vfio_pci 00:04:43.980 20:51:58 -- setup/driver.sh@14 -- # mod vfio_pci 00:04:43.980 20:51:58 -- setup/driver.sh@12 -- # dep vfio_pci 00:04:43.980 20:51:58 -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:04:43.980 20:51:58 -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:04:43.980 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:43.980 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:43.980 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:43.980 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:43.980 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:04:43.980 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:04:43.980 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:04:43.980 20:51:58 -- setup/driver.sh@30 -- # return 0 00:04:43.980 20:51:58 -- setup/driver.sh@37 -- # echo vfio-pci 00:04:43.980 20:51:58 -- setup/driver.sh@49 -- # driver=vfio-pci 00:04:43.980 20:51:58 -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:04:43.980 20:51:58 -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:04:43.980 Looking for driver=vfio-pci 00:04:43.980 20:51:58 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:43.980 20:51:58 -- setup/driver.sh@45 -- # setup output config 00:04:43.980 20:51:58 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:43.980 20:51:58 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:46.518 20:52:01 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:46.518 20:52:01 -- setup/driver.sh@61 -- # [[ 
vfio-pci == vfio-pci ]] 00:04:46.518 20:52:01 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver
[xtrace condensed: the triple '[[ -> == \-\> ]] / [[ vfio-pci == vfio-pci ]] / read -r _ _ _ _ marker setup_driver' repeats once per device line printed by 'setup output config'; every marker reports vfio-pci]
00:04:47.892 20:52:03 --
setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:47.892 20:52:03 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:47.892 20:52:03 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:47.892 20:52:03 -- setup/driver.sh@64 -- # (( fail == 0 )) 00:04:47.892 20:52:03 -- setup/driver.sh@65 -- # setup reset 00:04:47.892 20:52:03 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:47.892 20:52:03 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:53.226 00:04:53.226 real 0m9.249s 00:04:53.226 user 0m2.315s 00:04:53.226 sys 0m4.507s 00:04:53.226 20:52:07 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:04:53.226 20:52:07 -- common/autotest_common.sh@10 -- # set +x 00:04:53.226 ************************************ 00:04:53.226 END TEST guess_driver 00:04:53.226 ************************************ 00:04:53.226 00:04:53.226 real 0m13.928s 00:04:53.226 user 0m3.545s 00:04:53.226 sys 0m7.119s 00:04:53.226 20:52:08 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:04:53.226 20:52:08 -- common/autotest_common.sh@10 -- # set +x 00:04:53.226 ************************************ 00:04:53.226 END TEST driver 00:04:53.226 ************************************ 00:04:53.226 20:52:08 -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:04:53.226 20:52:08 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:53.226 20:52:08 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:53.226 20:52:08 -- common/autotest_common.sh@10 -- # set +x 00:04:53.226 ************************************ 00:04:53.226 START TEST devices 00:04:53.226 ************************************ 00:04:53.226 20:52:08 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:04:53.226 * Looking for test storage... 
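The guess_driver run that just passed settles on vfio-pci from two checks visible in its trace: /sys/kernel/iommu_groups is populated (176 groups on this box), and 'modprobe --show-depends vfio_pci' resolves to real .ko modules. Roughly, as a sketch -- the uio fallback branch is an assumption here, since this run never reaches it:

    # Prefer vfio-pci when the IOMMU is active and the module resolves,
    # mirroring the pick_driver/is_driver checks in the trace.
    iommu_groups=(/sys/kernel/iommu_groups/*)
    if (( ${#iommu_groups[@]} > 0 )) &&
        modprobe --show-depends vfio_pci 2> /dev/null | grep -q '\.ko'; then
        echo vfio-pci
    else
        echo uio_pci_generic    # assumed fallback for machines without an IOMMU
    fi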
00:04:53.226 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup
00:04:53.226 20:52:08 -- setup/devices.sh@190 -- # trap cleanup EXIT 00:04:53.226 20:52:08 -- setup/devices.sh@192 -- # setup reset 00:04:53.226 20:52:08 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:53.226 20:52:08 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset
00:04:56.541 20:52:11 -- setup/devices.sh@194 -- # get_zoned_devs 00:04:56.541 20:52:11 -- common/autotest_common.sh@1655 -- # zoned_devs=() 00:04:56.541 20:52:11 -- common/autotest_common.sh@1655 -- # local -gA zoned_devs 00:04:56.541 20:52:11 -- common/autotest_common.sh@1656 -- # local nvme bdf 00:04:56.541 20:52:11 -- common/autotest_common.sh@1658 -- # for nvme in /sys/block/nvme* 00:04:56.541 20:52:11 -- common/autotest_common.sh@1659 -- # is_block_zoned nvme0n1 00:04:56.541 20:52:11 -- common/autotest_common.sh@1648 -- # local device=nvme0n1 00:04:56.541 20:52:11 -- common/autotest_common.sh@1650 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:56.541 20:52:11 -- common/autotest_common.sh@1651 -- # [[ none != none ]]
00:04:56.541 20:52:11 -- setup/devices.sh@196 -- # blocks=() 00:04:56.541 20:52:11 -- setup/devices.sh@196 -- # declare -a blocks 00:04:56.541 20:52:11 -- setup/devices.sh@197 -- # blocks_to_pci=() 00:04:56.541 20:52:11 -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:04:56.541 20:52:11 -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:04:56.541 20:52:11 -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:04:56.541 20:52:11 -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:04:56.541 20:52:11 -- setup/devices.sh@201 -- # ctrl=nvme0 00:04:56.541 20:52:11 -- setup/devices.sh@202 -- # pci=0000:d8:00.0 00:04:56.541 20:52:11 -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:04:56.541 20:52:11 -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:04:56.541 20:52:11 -- scripts/common.sh@378 -- # local block=nvme0n1 pt 00:04:56.541 20:52:11 -- scripts/common.sh@387 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1
00:04:56.541 No valid GPT data, bailing
00:04:56.542 20:52:12 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:04:56.542 20:52:12 -- scripts/common.sh@391 -- # pt= 00:04:56.542 20:52:12 -- scripts/common.sh@392 -- # return 1 00:04:56.542 20:52:12 -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:04:56.542 20:52:12 -- setup/common.sh@76 -- # local dev=nvme0n1 00:04:56.542 20:52:12 -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:04:56.542 20:52:12 -- setup/common.sh@80 -- # echo 1600321314816 00:04:56.542 20:52:12 -- setup/devices.sh@204 -- # (( 1600321314816 >= min_disk_size )) 00:04:56.542 20:52:12 -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:04:56.542 20:52:12 -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:d8:00.0 00:04:56.542 20:52:12 -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:04:56.542 20:52:12 -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:04:56.542 20:52:12 -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:04:56.542 20:52:12 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:56.542 20:52:12 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:56.542 20:52:12 -- common/autotest_common.sh@10 -- # set +x
00:04:56.542 ************************************
00:04:56.542 START TEST nvme_mount
00:04:56.542 ************************************
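A disk is admitted for this test only if nothing claims it (spdk-gpt.py and blkid find no partition table, hence "No valid GPT data, bailing" followed by return 1) and it is at least min_disk_size=3221225472 bytes (3 GiB); the trace's 1600321314816 is the sysfs sector count times 512. A sketch of those two checks using only stock tools -- the spdk-gpt.py probe is the repo's own and is omitted here:

    #!/usr/bin/env bash
    shopt -s extglob                            # for the !(*c*) glob below
    min_disk_size=$((3 * 1024 * 1024 * 1024))   # 3221225472, as in the trace
    for dev in /sys/block/nvme!(*c*); do        # skip nvme*c* multipath nodes, like the script
        name=${dev##*/}
        # An empty PTTYPE is the "no valid GPT data" case: the disk is free.
        pt=$(blkid -s PTTYPE -o value "/dev/$name" 2> /dev/null) || true
        [[ -z $pt ]] || continue
        size=$(( $(< "$dev/size") * 512 ))      # sysfs size is in 512-byte sectors
        (( size >= min_disk_size )) && echo "$name usable ($size bytes)"
    done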
00:04:56.542 20:52:12 -- common/autotest_common.sh@1111 -- # nvme_mount 00:04:56.542 20:52:12 -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:04:56.542 20:52:12 -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:04:56.542 20:52:12 -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:56.542 20:52:12 -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:56.542 20:52:12 -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:04:56.542 20:52:12 -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:56.542 20:52:12 -- setup/common.sh@40 -- # local part_no=1 00:04:56.542 20:52:12 -- setup/common.sh@41 -- # local size=1073741824 00:04:56.542 20:52:12 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:56.542 20:52:12 -- setup/common.sh@44 -- # parts=() 00:04:56.542 20:52:12 -- setup/common.sh@44 -- # local parts 00:04:56.542 20:52:12 -- setup/common.sh@46 -- # (( part = 1 )) 00:04:56.542 20:52:12 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:56.542 20:52:12 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:56.542 20:52:12 -- setup/common.sh@46 -- # (( part++ )) 00:04:56.542 20:52:12 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:56.542 20:52:12 -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:56.542 20:52:12 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:56.542 20:52:12 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1
00:04:57.922 Creating new GPT entries in memory.
00:04:57.922 GPT data structures destroyed! You may now partition the disk using fdisk or
00:04:57.922 other utilities.
00:04:57.922 20:52:13 -- setup/common.sh@57 -- # (( part = 1 )) 00:04:57.922 20:52:13 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:57.922 20:52:13 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:57.922 20:52:13 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:57.922 20:52:13 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199
00:04:58.860 Creating new GPT entries in memory.
00:04:58.860 The operation has completed successfully.
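partition_drive, traced above, converts the 1 GiB partition size to 512-byte sectors (1073741824 / 512 = 2097152), so the single partition spans sectors 2048 through 2099199 -- exactly the sgdisk arguments in the trace. Standalone, and destructive, so the target below is illustrative:

    disk=/dev/nvme0n1                     # illustrative target -- this wipes it
    size=$((1073741824 / 512))            # 1 GiB in 512-byte sectors = 2097152
    sgdisk "$disk" --zap-all              # destroy GPT and MBR data structures
    part_start=2048
    part_end=$((part_start + size - 1))   # 2099199, matching the trace
    flock "$disk" sgdisk "$disk" --new=1:$part_start:$part_end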
00:04:58.860 20:52:14 -- setup/common.sh@57 -- # (( part++ )) 00:04:58.860 20:52:14 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:58.860 20:52:14 -- setup/common.sh@62 -- # wait 154895 00:04:58.861 20:52:14 -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:58.861 20:52:14 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size= 00:04:58.861 20:52:14 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:58.861 20:52:14 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:04:58.861 20:52:14 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:04:58.861 20:52:14 -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:58.861 20:52:14 -- setup/devices.sh@105 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:58.861 20:52:14 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:58.861 20:52:14 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:04:58.861 20:52:14 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:58.861 20:52:14 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:58.861 20:52:14 -- setup/devices.sh@53 -- # local found=0 00:04:58.861 20:52:14 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:58.861 20:52:14 -- setup/devices.sh@56 -- # : 00:04:58.861 20:52:14 -- setup/devices.sh@59 -- # local pci status 00:04:58.861 20:52:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:58.861 20:52:14 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:58.861 20:52:14 -- setup/devices.sh@47 -- # setup output config 00:04:58.861 20:52:14 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:58.861 20:52:14 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:02.152 20:52:17 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:02.152 20:52:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:02.153 20:52:17 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:02.153 20:52:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:02.153 20:52:17 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:02.153 20:52:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:02.153 20:52:17 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:02.153 20:52:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:02.153 20:52:17 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:02.153 20:52:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:02.153 20:52:17 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:02.153 20:52:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:02.153 20:52:17 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:02.153 20:52:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 
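The mkfs helper traced above is three steps: create the mount point, format the new partition, mount it. As a sketch, with the mount point shortened for illustration (the test itself uses its nvme_mount directory under test/setup):

    part=/dev/nvme0n1p1
    mnt=/tmp/nvme_mount       # illustrative mount point
    mkdir -p "$mnt"
    mkfs.ext4 -qF "$part"     # -q: quiet; -F: don't prompt for a non-whole-disk target
    mount "$part" "$mnt"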
00:05:02.153 20:52:17 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:02.153 20:52:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:02.153 20:52:17 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:02.153 20:52:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:02.153 20:52:17 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:02.153 20:52:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:02.153 20:52:17 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:02.153 20:52:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:02.153 20:52:17 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:02.153 20:52:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:02.153 20:52:17 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:02.153 20:52:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:02.153 20:52:17 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:02.153 20:52:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:02.153 20:52:17 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:02.153 20:52:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:02.153 20:52:17 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:02.153 20:52:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:02.153 20:52:17 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:02.153 20:52:17 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:05:02.153 20:52:17 -- setup/devices.sh@63 -- # found=1 00:05:02.153 20:52:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:02.153 20:52:17 -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:02.153 20:52:17 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:05:02.153 20:52:17 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:02.153 20:52:17 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:02.153 20:52:17 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:02.153 20:52:17 -- setup/devices.sh@110 -- # cleanup_nvme 00:05:02.153 20:52:17 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:02.153 20:52:17 -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:02.153 20:52:17 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:02.153 20:52:17 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:02.153 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:02.153 20:52:17 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:02.153 20:52:17 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:02.412 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:02.412 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:05:02.412 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe 
(PMBR): 55 aa 00:05:02.412 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:02.412 20:52:17 -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:05:02.412 20:52:17 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:05:02.412 20:52:17 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:02.412 20:52:17 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:05:02.412 20:52:17 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:05:02.412 20:52:17 -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:02.412 20:52:17 -- setup/devices.sh@116 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:02.412 20:52:17 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:02.412 20:52:17 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:05:02.412 20:52:17 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:02.412 20:52:17 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:02.412 20:52:17 -- setup/devices.sh@53 -- # local found=0 00:05:02.412 20:52:17 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:02.412 20:52:17 -- setup/devices.sh@56 -- # : 00:05:02.412 20:52:17 -- setup/devices.sh@59 -- # local pci status 00:05:02.412 20:52:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:02.412 20:52:17 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:02.412 20:52:17 -- setup/devices.sh@47 -- # setup output config 00:05:02.412 20:52:17 -- setup/common.sh@9 -- # [[ output == output ]] 00:05:02.412 20:52:17 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:05.702 20:52:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.702 20:52:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.702 20:52:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.702 20:52:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.702 20:52:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.702 20:52:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.702 20:52:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.702 20:52:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.702 20:52:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.702 20:52:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.702 20:52:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.702 20:52:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.702 20:52:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.702 20:52:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.702 20:52:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 
00:05:05.702 20:52:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.702 20:52:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.702 20:52:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.702 20:52:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.702 20:52:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.702 20:52:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.702 20:52:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.702 20:52:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.702 20:52:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.702 20:52:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.702 20:52:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.702 20:52:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.702 20:52:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.702 20:52:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.702 20:52:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.702 20:52:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.702 20:52:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.702 20:52:20 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:05.702 20:52:20 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:05:05.702 20:52:20 -- setup/devices.sh@63 -- # found=1 00:05:05.702 20:52:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.702 20:52:21 -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:05.702 20:52:21 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:05:05.702 20:52:21 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:05.702 20:52:21 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:05.702 20:52:21 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:05.702 20:52:21 -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:05.702 20:52:21 -- setup/devices.sh@125 -- # verify 0000:d8:00.0 data@nvme0n1 '' '' 00:05:05.702 20:52:21 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:05.702 20:52:21 -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:05:05.702 20:52:21 -- setup/devices.sh@50 -- # local mount_point= 00:05:05.702 20:52:21 -- setup/devices.sh@51 -- # local test_file= 00:05:05.702 20:52:21 -- setup/devices.sh@53 -- # local found=0 00:05:05.702 20:52:21 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:05.702 20:52:21 -- setup/devices.sh@59 -- # local pci status 00:05:05.702 20:52:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.702 20:52:21 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:05.702 20:52:21 -- setup/devices.sh@47 -- # setup output config 00:05:05.702 20:52:21 -- setup/common.sh@9 -- # [[ output == output ]] 00:05:05.702 20:52:21 -- setup/common.sh@10 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:08.990 20:52:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.990 20:52:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.990 20:52:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.990 20:52:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.990 20:52:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.990 20:52:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.990 20:52:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.990 20:52:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.990 20:52:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.990 20:52:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.990 20:52:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.990 20:52:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.990 20:52:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.990 20:52:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.990 20:52:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.990 20:52:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.990 20:52:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.990 20:52:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.990 20:52:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.990 20:52:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.990 20:52:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.990 20:52:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.990 20:52:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.990 20:52:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.990 20:52:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.990 20:52:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.990 20:52:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.990 20:52:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.990 20:52:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.990 20:52:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.990 20:52:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.990 20:52:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.990 20:52:24 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:08.990 20:52:24 -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:05:08.990 20:52:24 -- setup/devices.sh@63 -- # found=1 00:05:08.990 20:52:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.990 20:52:24 -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:08.990 20:52:24 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:08.990 20:52:24 -- setup/devices.sh@68 -- # return 0 00:05:08.990 20:52:24 -- setup/devices.sh@128 -- # cleanup_nvme 00:05:08.990 20:52:24 -- setup/devices.sh@20 -- # mountpoint -q 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:05:08.990 20:52:24 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]]
00:05:08.990 20:52:24 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]]
00:05:08.990 20:52:24 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1
00:05:08.990 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef
00:05:08.990
00:05:08.990 real 0m12.207s
00:05:08.990 user 0m3.523s
00:05:08.990 sys 0m6.560s
00:05:08.990 20:52:24 -- common/autotest_common.sh@1112 -- # xtrace_disable
00:05:08.990 20:52:24 -- common/autotest_common.sh@10 -- # set +x
00:05:08.990 ************************************
00:05:08.990 END TEST nvme_mount
00:05:08.990 ************************************
00:05:08.990 20:52:24 -- setup/devices.sh@214 -- # run_test dm_mount dm_mount
00:05:08.990 20:52:24 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:05:08.990 20:52:24 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:05:08.990 20:52:24 -- common/autotest_common.sh@10 -- # set +x
00:05:08.990 ************************************
00:05:08.990 START TEST dm_mount
00:05:08.990 ************************************
00:05:08.990 20:52:24 -- common/autotest_common.sh@1111 -- # dm_mount
00:05:08.990 20:52:24 -- setup/devices.sh@144 -- # pv=nvme0n1
00:05:08.990 20:52:24 -- setup/devices.sh@145 -- # pv0=nvme0n1p1
00:05:08.990 20:52:24 -- setup/devices.sh@146 -- # pv1=nvme0n1p2
00:05:08.990 20:52:24 -- setup/devices.sh@148 -- # partition_drive nvme0n1
00:05:08.990 20:52:24 -- setup/common.sh@39 -- # local disk=nvme0n1
00:05:08.990 20:52:24 -- setup/common.sh@40 -- # local part_no=2
00:05:08.990 20:52:24 -- setup/common.sh@41 -- # local size=1073741824
00:05:08.990 20:52:24 -- setup/common.sh@43 -- # local part part_start=0 part_end=0
00:05:08.990 20:52:24 -- setup/common.sh@44 -- # parts=()
00:05:08.990 20:52:24 -- setup/common.sh@44 -- # local parts
00:05:08.990 20:52:24 -- setup/common.sh@46 -- # (( part = 1 ))
00:05:08.990 20:52:24 -- setup/common.sh@46 -- # (( part <= part_no ))
00:05:08.990 20:52:24 -- setup/common.sh@47 -- # parts+=("${disk}p$part")
00:05:08.990 20:52:24 -- setup/common.sh@46 -- # (( part++ ))
00:05:08.990 20:52:24 -- setup/common.sh@46 -- # (( part <= part_no ))
00:05:08.990 20:52:24 -- setup/common.sh@47 -- # parts+=("${disk}p$part")
00:05:08.990 20:52:24 -- setup/common.sh@46 -- # (( part++ ))
00:05:08.990 20:52:24 -- setup/common.sh@46 -- # (( part <= part_no ))
00:05:08.990 20:52:24 -- setup/common.sh@51 -- # (( size /= 512 ))
00:05:08.990 20:52:24 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all
00:05:08.990 20:52:24 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2
00:05:10.370 Creating new GPT entries in memory.
00:05:10.370 GPT data structures destroyed! You may now partition the disk using fdisk or
00:05:10.370 other utilities.
00:05:10.370 20:52:25 -- setup/common.sh@57 -- # (( part = 1 ))
00:05:10.370 20:52:25 -- setup/common.sh@57 -- # (( part <= part_no ))
00:05:10.370 20:52:25 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 ))
00:05:10.370 20:52:25 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 ))
00:05:10.370 20:52:25 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199
00:05:11.304 Creating new GPT entries in memory.
00:05:11.304 The operation has completed successfully.
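
The dm_mount test runs the same partition_drive helper with part_no=2; the second `sgdisk --new=2:2099200:4196351` call follows below, after which the two partitions are stitched into a single device-mapper node named nvme_dm_test. A sketch of that device-mapper step, assuming a plain linear-concatenation table (the log records the `dmsetup create nvme_dm_test` call but not the table it was fed):

  # Join the two test partitions into one linear device-mapper node (sketch).
  p1_sz=$(blockdev --getsz /dev/nvme0n1p1)   # partition sizes in 512-byte sectors
  p2_sz=$(blockdev --getsz /dev/nvme0n1p2)
  dmsetup create nvme_dm_test <<EOF
  0 $p1_sz linear /dev/nvme0n1p1 0
  $p1_sz $p2_sz linear /dev/nvme0n1p2 0
  EOF
  readlink -f /dev/mapper/nvme_dm_test       # resolves to /dev/dm-0 in this run
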
00:05:11.304 20:52:26 -- setup/common.sh@57 -- # (( part++ )) 00:05:11.304 20:52:26 -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:11.304 20:52:26 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:11.305 20:52:26 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:11.305 20:52:26 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:05:12.242 The operation has completed successfully. 00:05:12.242 20:52:27 -- setup/common.sh@57 -- # (( part++ )) 00:05:12.242 20:52:27 -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:12.242 20:52:27 -- setup/common.sh@62 -- # wait 159324 00:05:12.242 20:52:27 -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:05:12.242 20:52:27 -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:12.242 20:52:27 -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:12.242 20:52:27 -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:05:12.242 20:52:27 -- setup/devices.sh@160 -- # for t in {1..5} 00:05:12.242 20:52:27 -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:12.242 20:52:27 -- setup/devices.sh@161 -- # break 00:05:12.242 20:52:27 -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:12.242 20:52:27 -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:05:12.242 20:52:27 -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:05:12.242 20:52:27 -- setup/devices.sh@166 -- # dm=dm-0 00:05:12.242 20:52:27 -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:05:12.242 20:52:27 -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:05:12.242 20:52:27 -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:12.242 20:52:27 -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount size= 00:05:12.242 20:52:27 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:12.242 20:52:27 -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:12.242 20:52:27 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:05:12.242 20:52:27 -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:12.242 20:52:27 -- setup/devices.sh@174 -- # verify 0000:d8:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:12.242 20:52:27 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:12.242 20:52:27 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:05:12.242 20:52:27 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:12.242 20:52:27 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:12.242 20:52:27 -- setup/devices.sh@53 -- # local found=0 00:05:12.242 20:52:27 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:12.242 20:52:27 -- setup/devices.sh@56 -- # : 00:05:12.242 
20:52:27 -- setup/devices.sh@59 -- # local pci status 00:05:12.242 20:52:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:12.242 20:52:27 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:12.242 20:52:27 -- setup/devices.sh@47 -- # setup output config 00:05:12.242 20:52:27 -- setup/common.sh@9 -- # [[ output == output ]] 00:05:12.242 20:52:27 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:14.773 20:52:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:14.773 20:52:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.773 20:52:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:14.773 20:52:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.773 20:52:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:14.773 20:52:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.773 20:52:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:14.773 20:52:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.773 20:52:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:14.773 20:52:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.773 20:52:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:14.773 20:52:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.773 20:52:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:14.773 20:52:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.773 20:52:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:14.773 20:52:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.773 20:52:30 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:14.773 20:52:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.773 20:52:30 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:14.773 20:52:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.773 20:52:30 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:14.773 20:52:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.774 20:52:30 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:14.774 20:52:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.774 20:52:30 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:14.774 20:52:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.774 20:52:30 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:14.774 20:52:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.774 20:52:30 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:14.774 20:52:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.774 20:52:30 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:14.774 20:52:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:15.032 20:52:30 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:15.032 20:52:30 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:05:15.032 20:52:30 
-- setup/devices.sh@63 -- # found=1 00:05:15.032 20:52:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:15.032 20:52:30 -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:15.032 20:52:30 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount ]] 00:05:15.032 20:52:30 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:15.032 20:52:30 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:15.032 20:52:30 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:15.032 20:52:30 -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:15.032 20:52:30 -- setup/devices.sh@184 -- # verify 0000:d8:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:05:15.032 20:52:30 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:15.032 20:52:30 -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:05:15.032 20:52:30 -- setup/devices.sh@50 -- # local mount_point= 00:05:15.032 20:52:30 -- setup/devices.sh@51 -- # local test_file= 00:05:15.032 20:52:30 -- setup/devices.sh@53 -- # local found=0 00:05:15.032 20:52:30 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:15.032 20:52:30 -- setup/devices.sh@59 -- # local pci status 00:05:15.032 20:52:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:15.032 20:52:30 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:15.032 20:52:30 -- setup/devices.sh@47 -- # setup output config 00:05:15.033 20:52:30 -- setup/common.sh@9 -- # [[ output == output ]] 00:05:15.033 20:52:30 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:18.322 20:52:33 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:18.322 20:52:33 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:18.322 20:52:33 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:18.322 20:52:33 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:18.322 20:52:33 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:18.322 20:52:33 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:18.322 20:52:33 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:18.322 20:52:33 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:18.322 20:52:33 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:18.322 20:52:33 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:18.322 20:52:33 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:18.322 20:52:33 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:18.322 20:52:33 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:18.322 20:52:33 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:18.322 20:52:33 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:18.322 20:52:33 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:18.322 20:52:33 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:18.322 20:52:33 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:18.322 20:52:33 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:18.322 20:52:33 -- 
setup/devices.sh@60 -- # read -r pci _ _ status 00:05:18.322 20:52:33 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:18.322 20:52:33 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:18.322 20:52:33 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:18.322 20:52:33 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:18.322 20:52:33 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:18.322 20:52:33 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:18.322 20:52:33 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:18.322 20:52:33 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:18.322 20:52:33 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:18.322 20:52:33 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:18.322 20:52:33 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:18.322 20:52:33 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:18.322 20:52:33 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:18.322 20:52:33 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:05:18.322 20:52:33 -- setup/devices.sh@63 -- # found=1 00:05:18.322 20:52:33 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:18.322 20:52:33 -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:18.322 20:52:33 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:18.322 20:52:33 -- setup/devices.sh@68 -- # return 0 00:05:18.322 20:52:33 -- setup/devices.sh@187 -- # cleanup_dm 00:05:18.322 20:52:33 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:18.322 20:52:33 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:18.322 20:52:33 -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:05:18.322 20:52:33 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:18.322 20:52:33 -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:05:18.322 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:18.322 20:52:33 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:18.322 20:52:33 -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:05:18.322 00:05:18.322 real 0m9.177s 00:05:18.322 user 0m2.097s 00:05:18.322 sys 0m4.062s 00:05:18.322 20:52:33 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:18.322 20:52:33 -- common/autotest_common.sh@10 -- # set +x 00:05:18.322 ************************************ 00:05:18.322 END TEST dm_mount 00:05:18.322 ************************************ 00:05:18.322 20:52:33 -- setup/devices.sh@1 -- # cleanup 00:05:18.322 20:52:33 -- setup/devices.sh@11 -- # cleanup_nvme 00:05:18.322 20:52:33 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:18.322 20:52:33 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:18.322 20:52:33 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:18.322 20:52:33 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:18.322 20:52:33 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:18.581 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:18.581 /dev/nvme0n1: 
8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:05:18.581 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:18.581 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:18.581 20:52:34 -- setup/devices.sh@12 -- # cleanup_dm 00:05:18.581 20:52:34 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:18.581 20:52:34 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:18.581 20:52:34 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:18.581 20:52:34 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:18.581 20:52:34 -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:05:18.581 20:52:34 -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:05:18.581 00:05:18.581 real 0m25.907s 00:05:18.581 user 0m7.186s 00:05:18.581 sys 0m13.434s 00:05:18.581 20:52:34 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:18.581 20:52:34 -- common/autotest_common.sh@10 -- # set +x 00:05:18.581 ************************************ 00:05:18.581 END TEST devices 00:05:18.581 ************************************ 00:05:18.581 00:05:18.581 real 1m31.445s 00:05:18.581 user 0m28.149s 00:05:18.581 sys 0m51.372s 00:05:18.581 20:52:34 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:18.581 20:52:34 -- common/autotest_common.sh@10 -- # set +x 00:05:18.581 ************************************ 00:05:18.581 END TEST setup.sh 00:05:18.581 ************************************ 00:05:18.581 20:52:34 -- spdk/autotest.sh@128 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:05:21.862 Hugepages 00:05:21.862 node hugesize free / total 00:05:21.862 node0 1048576kB 0 / 0 00:05:21.862 node0 2048kB 2048 / 2048 00:05:21.862 node1 1048576kB 0 / 0 00:05:21.862 node1 2048kB 0 / 0 00:05:21.862 00:05:21.862 Type BDF Vendor Device NUMA Driver Device Block devices 00:05:21.862 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:05:21.862 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:05:21.862 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:05:21.862 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:05:21.862 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:05:21.862 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:05:21.862 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:05:21.862 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:05:21.862 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:05:21.862 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:05:21.862 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:05:21.862 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:05:21.862 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:05:21.862 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:05:21.862 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:05:21.862 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:05:21.862 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:05:21.862 20:52:37 -- spdk/autotest.sh@130 -- # uname -s 00:05:21.862 20:52:37 -- spdk/autotest.sh@130 -- # [[ Linux == Linux ]] 00:05:21.862 20:52:37 -- spdk/autotest.sh@132 -- # nvme_namespace_revert 00:05:21.862 20:52:37 -- common/autotest_common.sh@1517 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:25.148 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:25.148 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:25.148 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:25.148 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:25.148 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 
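
The rebind list continues below. Each "ioatdma -> vfio-pci" line is setup.sh detaching a device from its kernel driver and handing it to vfio-pci so user-space DPDK/SPDK can claim it. A sketch of the underlying sysfs writes for one I/OAT channel; the exact script internals are not shown in this log, and driver_override is an assumption about the mechanism used:

  # Rebind one device from ioatdma to vfio-pci via sysfs (sketch, run as root).
  bdf=0000:00:04.7
  echo "$bdf" > "/sys/bus/pci/devices/$bdf/driver/unbind"       # detach ioatdma
  echo vfio-pci > "/sys/bus/pci/devices/$bdf/driver_override"   # pin the new driver
  echo "$bdf" > /sys/bus/pci/drivers_probe                      # re-probe; vfio-pci binds
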
00:05:25.148 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:25.148 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:25.148 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:25.148 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:25.148 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:25.148 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:25.148 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:25.148 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:25.148 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:25.148 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:25.148 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:26.524 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:05:26.524 20:52:42 -- common/autotest_common.sh@1518 -- # sleep 1 00:05:27.460 20:52:43 -- common/autotest_common.sh@1519 -- # bdfs=() 00:05:27.460 20:52:43 -- common/autotest_common.sh@1519 -- # local bdfs 00:05:27.460 20:52:43 -- common/autotest_common.sh@1520 -- # bdfs=($(get_nvme_bdfs)) 00:05:27.460 20:52:43 -- common/autotest_common.sh@1520 -- # get_nvme_bdfs 00:05:27.460 20:52:43 -- common/autotest_common.sh@1499 -- # bdfs=() 00:05:27.460 20:52:43 -- common/autotest_common.sh@1499 -- # local bdfs 00:05:27.460 20:52:43 -- common/autotest_common.sh@1500 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:27.460 20:52:43 -- common/autotest_common.sh@1500 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:27.460 20:52:43 -- common/autotest_common.sh@1500 -- # jq -r '.config[].params.traddr' 00:05:27.719 20:52:43 -- common/autotest_common.sh@1501 -- # (( 1 == 0 )) 00:05:27.719 20:52:43 -- common/autotest_common.sh@1505 -- # printf '%s\n' 0000:d8:00.0 00:05:27.719 20:52:43 -- common/autotest_common.sh@1522 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:31.044 Waiting for block devices as requested 00:05:31.044 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:31.044 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:31.044 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:31.044 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:31.044 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:31.044 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:31.044 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:31.303 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:31.303 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:31.303 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:31.562 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:31.563 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:31.563 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:31.821 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:31.821 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:31.821 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:32.081 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:05:32.081 20:52:47 -- common/autotest_common.sh@1524 -- # for bdf in "${bdfs[@]}" 00:05:32.081 20:52:47 -- common/autotest_common.sh@1525 -- # get_nvme_ctrlr_from_bdf 0000:d8:00.0 00:05:32.081 20:52:47 -- common/autotest_common.sh@1488 -- # readlink -f /sys/class/nvme/nvme0 00:05:32.081 20:52:47 -- common/autotest_common.sh@1488 -- # grep 0000:d8:00.0/nvme/nvme 00:05:32.081 20:52:47 -- common/autotest_common.sh@1488 -- # bdf_sysfs_path=/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:05:32.081 20:52:47 -- common/autotest_common.sh@1489 -- # [[ -z 
/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 ]] 00:05:32.081 20:52:47 -- common/autotest_common.sh@1493 -- # basename /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:05:32.081 20:52:47 -- common/autotest_common.sh@1493 -- # printf '%s\n' nvme0 00:05:32.081 20:52:47 -- common/autotest_common.sh@1525 -- # nvme_ctrlr=/dev/nvme0 00:05:32.081 20:52:47 -- common/autotest_common.sh@1526 -- # [[ -z /dev/nvme0 ]] 00:05:32.081 20:52:47 -- common/autotest_common.sh@1531 -- # nvme id-ctrl /dev/nvme0 00:05:32.081 20:52:47 -- common/autotest_common.sh@1531 -- # grep oacs 00:05:32.081 20:52:47 -- common/autotest_common.sh@1531 -- # cut -d: -f2 00:05:32.081 20:52:47 -- common/autotest_common.sh@1531 -- # oacs=' 0xe' 00:05:32.081 20:52:47 -- common/autotest_common.sh@1532 -- # oacs_ns_manage=8 00:05:32.081 20:52:47 -- common/autotest_common.sh@1534 -- # [[ 8 -ne 0 ]] 00:05:32.081 20:52:47 -- common/autotest_common.sh@1540 -- # nvme id-ctrl /dev/nvme0 00:05:32.081 20:52:47 -- common/autotest_common.sh@1540 -- # cut -d: -f2 00:05:32.081 20:52:47 -- common/autotest_common.sh@1540 -- # grep unvmcap 00:05:32.081 20:52:47 -- common/autotest_common.sh@1540 -- # unvmcap=' 0' 00:05:32.081 20:52:47 -- common/autotest_common.sh@1541 -- # [[ 0 -eq 0 ]] 00:05:32.081 20:52:47 -- common/autotest_common.sh@1543 -- # continue 00:05:32.081 20:52:47 -- spdk/autotest.sh@135 -- # timing_exit pre_cleanup 00:05:32.081 20:52:47 -- common/autotest_common.sh@716 -- # xtrace_disable 00:05:32.081 20:52:47 -- common/autotest_common.sh@10 -- # set +x 00:05:32.341 20:52:47 -- spdk/autotest.sh@138 -- # timing_enter afterboot 00:05:32.341 20:52:47 -- common/autotest_common.sh@710 -- # xtrace_disable 00:05:32.341 20:52:47 -- common/autotest_common.sh@10 -- # set +x 00:05:32.341 20:52:47 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:35.705 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:35.705 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:35.705 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:35.705 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:35.705 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:35.705 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:35.705 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:35.705 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:35.705 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:35.705 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:35.705 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:35.705 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:35.705 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:35.705 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:35.705 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:35.705 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:37.086 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:05:37.345 20:52:52 -- spdk/autotest.sh@140 -- # timing_exit afterboot 00:05:37.345 20:52:52 -- common/autotest_common.sh@716 -- # xtrace_disable 00:05:37.345 20:52:52 -- common/autotest_common.sh@10 -- # set +x 00:05:37.345 20:52:52 -- spdk/autotest.sh@144 -- # opal_revert_cleanup 00:05:37.345 20:52:52 -- common/autotest_common.sh@1577 -- # mapfile -t bdfs 00:05:37.345 20:52:52 -- common/autotest_common.sh@1577 -- # get_nvme_bdfs_by_id 0x0a54 00:05:37.345 20:52:52 -- common/autotest_common.sh@1563 -- # bdfs=() 00:05:37.345 20:52:52 -- common/autotest_common.sh@1563 -- # local bdfs 00:05:37.345 20:52:52 -- common/autotest_common.sh@1565 -- # 
get_nvme_bdfs 00:05:37.345 20:52:52 -- common/autotest_common.sh@1499 -- # bdfs=() 00:05:37.345 20:52:52 -- common/autotest_common.sh@1499 -- # local bdfs 00:05:37.345 20:52:52 -- common/autotest_common.sh@1500 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:37.345 20:52:52 -- common/autotest_common.sh@1500 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:37.345 20:52:52 -- common/autotest_common.sh@1500 -- # jq -r '.config[].params.traddr' 00:05:37.345 20:52:52 -- common/autotest_common.sh@1501 -- # (( 1 == 0 )) 00:05:37.345 20:52:52 -- common/autotest_common.sh@1505 -- # printf '%s\n' 0000:d8:00.0 00:05:37.345 20:52:52 -- common/autotest_common.sh@1565 -- # for bdf in $(get_nvme_bdfs) 00:05:37.345 20:52:52 -- common/autotest_common.sh@1566 -- # cat /sys/bus/pci/devices/0000:d8:00.0/device 00:05:37.345 20:52:52 -- common/autotest_common.sh@1566 -- # device=0x0a54 00:05:37.345 20:52:52 -- common/autotest_common.sh@1567 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:05:37.345 20:52:52 -- common/autotest_common.sh@1568 -- # bdfs+=($bdf) 00:05:37.345 20:52:52 -- common/autotest_common.sh@1572 -- # printf '%s\n' 0000:d8:00.0 00:05:37.345 20:52:52 -- common/autotest_common.sh@1578 -- # [[ -z 0000:d8:00.0 ]] 00:05:37.345 20:52:52 -- common/autotest_common.sh@1583 -- # spdk_tgt_pid=168626 00:05:37.345 20:52:52 -- common/autotest_common.sh@1584 -- # waitforlisten 168626 00:05:37.345 20:52:52 -- common/autotest_common.sh@1582 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:37.345 20:52:52 -- common/autotest_common.sh@817 -- # '[' -z 168626 ']' 00:05:37.345 20:52:52 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:37.345 20:52:52 -- common/autotest_common.sh@822 -- # local max_retries=100 00:05:37.345 20:52:52 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:37.345 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:37.345 20:52:52 -- common/autotest_common.sh@826 -- # xtrace_disable 00:05:37.345 20:52:52 -- common/autotest_common.sh@10 -- # set +x 00:05:37.345 [2024-04-25 20:52:52.969276] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:05:37.345 [2024-04-25 20:52:52.969362] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid168626 ] 00:05:37.345 EAL: No free 2048 kB hugepages reported on node 1 00:05:37.345 [2024-04-25 20:52:53.007246] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 
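
At this point autotest has launched spdk_tgt (pid 168626) and waits for its JSON-RPC socket; the calls traced next go through scripts/rpc.py over /var/tmp/spdk.sock. A condensed sketch of that exchange, where the rpc_get_methods poll is an assumption standing in for the waitforlisten retry loop:

  spdk=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  "$spdk/build/bin/spdk_tgt" &                        # serves JSON-RPC on /var/tmp/spdk.sock
  until "$spdk/scripts/rpc.py" rpc_get_methods >/dev/null 2>&1; do
    sleep 0.5                                         # assumption: simple poll, not waitforlisten
  done
  # Attach the NVMe controller at 0000:d8:00.0 as bdev controller "nvme0":
  "$spdk/scripts/rpc.py" bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0
  # The Opal revert fails on this drive (no Opal support), producing the
  # -32602 "Invalid parameters" JSON-RPC error shown below:
  "$spdk/scripts/rpc.py" bdev_nvme_opal_revert -b nvme0 -p test || true
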
00:05:37.604 [2024-04-25 20:52:53.040659] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1
00:05:37.604 [2024-04-25 20:52:53.081115] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:05:37.604 20:52:53 -- common/autotest_common.sh@846 -- # (( i == 0 ))
00:05:37.604 20:52:53 -- common/autotest_common.sh@850 -- # return 0
00:05:37.604 20:52:53 -- common/autotest_common.sh@1586 -- # bdf_id=0
00:05:37.604 20:52:53 -- common/autotest_common.sh@1587 -- # for bdf in "${bdfs[@]}"
00:05:37.604 20:52:53 -- common/autotest_common.sh@1588 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0
00:05:40.896 nvme0n1
00:05:40.896 20:52:56 -- common/autotest_common.sh@1590 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test
00:05:40.896 [2024-04-25 20:52:56.407499] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal
00:05:40.896 request:
00:05:40.896 {
00:05:40.896 "nvme_ctrlr_name": "nvme0",
00:05:40.896 "password": "test",
00:05:40.896 "method": "bdev_nvme_opal_revert",
00:05:40.896 "req_id": 1
00:05:40.896 }
00:05:40.896 Got JSON-RPC error response
00:05:40.896 response:
00:05:40.896 {
00:05:40.896 "code": -32602,
00:05:40.896 "message": "Invalid parameters"
00:05:40.896 }
00:05:40.896 20:52:56 -- common/autotest_common.sh@1590 -- # true
00:05:40.896 20:52:56 -- common/autotest_common.sh@1591 -- # (( ++bdf_id ))
00:05:40.896 20:52:56 -- common/autotest_common.sh@1594 -- # killprocess 168626
00:05:40.896 20:52:56 -- common/autotest_common.sh@936 -- # '[' -z 168626 ']'
00:05:40.896 20:52:56 -- common/autotest_common.sh@940 -- # kill -0 168626
00:05:40.896 20:52:56 -- common/autotest_common.sh@941 -- # uname
00:05:40.896 20:52:56 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']'
00:05:40.896 20:52:56 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 168626
00:05:40.896 20:52:56 -- common/autotest_common.sh@942 -- # process_name=reactor_0
00:05:40.896 20:52:56 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']'
00:05:40.896 20:52:56 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 168626'
killing process with pid 168626
00:05:40.896 20:52:56 -- common/autotest_common.sh@955 -- # kill 168626
00:05:40.896 20:52:56 -- common/autotest_common.sh@960 -- # wait 168626
00:05:43.432 20:52:58 -- spdk/autotest.sh@150 -- # '[' 0 -eq 1 ']'
00:05:43.432 20:52:58 -- spdk/autotest.sh@154 -- # '[' 1 -eq 1 ']'
00:05:43.432 20:52:58 -- spdk/autotest.sh@155 -- # [[ 0 -eq 1 ]]
00:05:43.432 20:52:58 -- spdk/autotest.sh@155 -- # [[ 0 -eq 1 ]]
00:05:43.432 20:52:58 -- spdk/autotest.sh@162 -- # timing_enter lib
00:05:43.432 20:52:58 -- common/autotest_common.sh@710 -- # xtrace_disable
00:05:43.432 20:52:58 -- common/autotest_common.sh@10 -- # set +x
00:05:43.432 20:52:58 -- spdk/autotest.sh@164 -- # run_test env /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh
00:05:43.432 20:52:58 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:05:43.432 20:52:58 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:05:43.432 20:52:58 -- common/autotest_common.sh@10 -- # set +x
00:05:43.432 ************************************
00:05:43.432 START TEST env
00:05:43.432 ************************************
00:05:43.432 20:52:58 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh
00:05:43.432 * Looking for test storage...
00:05:43.432 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env
00:05:43.432 20:52:58 -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut
00:05:43.432 20:52:58 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:05:43.432 20:52:58 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:05:43.432 20:52:58 -- common/autotest_common.sh@10 -- # set +x
00:05:43.432 ************************************
00:05:43.432 START TEST env_memory
00:05:43.432 ************************************
00:05:43.432 20:52:59 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut
00:05:43.432
00:05:43.432
00:05:43.432 CUnit - A unit testing framework for C - Version 2.1-3
00:05:43.432 http://cunit.sourceforge.net/
00:05:43.432
00:05:43.432
00:05:43.432 Suite: memory
00:05:43.432 Test: alloc and free memory map ...[2024-04-25 20:52:59.092744] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed
00:05:43.691 passed
00:05:43.691 Test: mem map translation ...[2024-04-25 20:52:59.105920] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234
00:05:43.691 [2024-04-25 20:52:59.105936] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152
00:05:43.691 [2024-04-25 20:52:59.105969] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656
00:05:43.691 [2024-04-25 20:52:59.105978] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map
00:05:43.691 passed
00:05:43.691 Test: mem map registration ...[2024-04-25 20:52:59.127136] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234
00:05:43.691 [2024-04-25 20:52:59.127152] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152
00:05:43.691 passed
00:05:43.691 Test: mem map adjacent registrations ...passed
00:05:43.691
00:05:43.691 Run Summary: Type Total Ran Passed Failed Inactive
00:05:43.691 suites 1 1 n/a 0 0
00:05:43.691 tests 4 4 4 0 0
00:05:43.691 asserts 152 152 152 0 n/a
00:05:43.691
00:05:43.691 Elapsed time = 0.086 seconds
00:05:43.691
00:05:43.691 real 0m0.099s
00:05:43.691 user 0m0.090s
00:05:43.691 sys 0m0.009s
00:05:43.691 20:52:59 -- common/autotest_common.sh@1112 -- # xtrace_disable
00:05:43.691 20:52:59 -- common/autotest_common.sh@10 -- # set +x
00:05:43.691 ************************************
00:05:43.691 END TEST env_memory
00:05:43.691 ************************************
00:05:43.691 20:52:59 -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys
00:05:43.951 20:52:59 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:05:43.951 20:52:59 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:05:43.951 20:52:59 --
common/autotest_common.sh@10 -- # set +x 00:05:43.951 ************************************ 00:05:43.951 START TEST env_vtophys 00:05:43.951 ************************************ 00:05:43.951 20:52:59 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:43.951 EAL: lib.eal log level changed from notice to debug 00:05:43.951 EAL: Detected lcore 0 as core 0 on socket 0 00:05:43.951 EAL: Detected lcore 1 as core 1 on socket 0 00:05:43.951 EAL: Detected lcore 2 as core 2 on socket 0 00:05:43.951 EAL: Detected lcore 3 as core 3 on socket 0 00:05:43.951 EAL: Detected lcore 4 as core 4 on socket 0 00:05:43.951 EAL: Detected lcore 5 as core 5 on socket 0 00:05:43.951 EAL: Detected lcore 6 as core 6 on socket 0 00:05:43.951 EAL: Detected lcore 7 as core 8 on socket 0 00:05:43.951 EAL: Detected lcore 8 as core 9 on socket 0 00:05:43.951 EAL: Detected lcore 9 as core 10 on socket 0 00:05:43.951 EAL: Detected lcore 10 as core 11 on socket 0 00:05:43.951 EAL: Detected lcore 11 as core 12 on socket 0 00:05:43.951 EAL: Detected lcore 12 as core 13 on socket 0 00:05:43.951 EAL: Detected lcore 13 as core 14 on socket 0 00:05:43.951 EAL: Detected lcore 14 as core 16 on socket 0 00:05:43.951 EAL: Detected lcore 15 as core 17 on socket 0 00:05:43.951 EAL: Detected lcore 16 as core 18 on socket 0 00:05:43.951 EAL: Detected lcore 17 as core 19 on socket 0 00:05:43.952 EAL: Detected lcore 18 as core 20 on socket 0 00:05:43.952 EAL: Detected lcore 19 as core 21 on socket 0 00:05:43.952 EAL: Detected lcore 20 as core 22 on socket 0 00:05:43.952 EAL: Detected lcore 21 as core 24 on socket 0 00:05:43.952 EAL: Detected lcore 22 as core 25 on socket 0 00:05:43.952 EAL: Detected lcore 23 as core 26 on socket 0 00:05:43.952 EAL: Detected lcore 24 as core 27 on socket 0 00:05:43.952 EAL: Detected lcore 25 as core 28 on socket 0 00:05:43.952 EAL: Detected lcore 26 as core 29 on socket 0 00:05:43.952 EAL: Detected lcore 27 as core 30 on socket 0 00:05:43.952 EAL: Detected lcore 28 as core 0 on socket 1 00:05:43.952 EAL: Detected lcore 29 as core 1 on socket 1 00:05:43.952 EAL: Detected lcore 30 as core 2 on socket 1 00:05:43.952 EAL: Detected lcore 31 as core 3 on socket 1 00:05:43.952 EAL: Detected lcore 32 as core 4 on socket 1 00:05:43.952 EAL: Detected lcore 33 as core 5 on socket 1 00:05:43.952 EAL: Detected lcore 34 as core 6 on socket 1 00:05:43.952 EAL: Detected lcore 35 as core 8 on socket 1 00:05:43.952 EAL: Detected lcore 36 as core 9 on socket 1 00:05:43.952 EAL: Detected lcore 37 as core 10 on socket 1 00:05:43.952 EAL: Detected lcore 38 as core 11 on socket 1 00:05:43.952 EAL: Detected lcore 39 as core 12 on socket 1 00:05:43.952 EAL: Detected lcore 40 as core 13 on socket 1 00:05:43.952 EAL: Detected lcore 41 as core 14 on socket 1 00:05:43.952 EAL: Detected lcore 42 as core 16 on socket 1 00:05:43.952 EAL: Detected lcore 43 as core 17 on socket 1 00:05:43.952 EAL: Detected lcore 44 as core 18 on socket 1 00:05:43.952 EAL: Detected lcore 45 as core 19 on socket 1 00:05:43.952 EAL: Detected lcore 46 as core 20 on socket 1 00:05:43.952 EAL: Detected lcore 47 as core 21 on socket 1 00:05:43.952 EAL: Detected lcore 48 as core 22 on socket 1 00:05:43.952 EAL: Detected lcore 49 as core 24 on socket 1 00:05:43.952 EAL: Detected lcore 50 as core 25 on socket 1 00:05:43.952 EAL: Detected lcore 51 as core 26 on socket 1 00:05:43.952 EAL: Detected lcore 52 as core 27 on socket 1 00:05:43.952 EAL: Detected lcore 53 as core 28 on socket 1 00:05:43.952 
EAL: Detected lcore 54 as core 29 on socket 1 00:05:43.952 EAL: Detected lcore 55 as core 30 on socket 1 00:05:43.952 EAL: Detected lcore 56 as core 0 on socket 0 00:05:43.952 EAL: Detected lcore 57 as core 1 on socket 0 00:05:43.952 EAL: Detected lcore 58 as core 2 on socket 0 00:05:43.952 EAL: Detected lcore 59 as core 3 on socket 0 00:05:43.952 EAL: Detected lcore 60 as core 4 on socket 0 00:05:43.952 EAL: Detected lcore 61 as core 5 on socket 0 00:05:43.952 EAL: Detected lcore 62 as core 6 on socket 0 00:05:43.952 EAL: Detected lcore 63 as core 8 on socket 0 00:05:43.952 EAL: Detected lcore 64 as core 9 on socket 0 00:05:43.952 EAL: Detected lcore 65 as core 10 on socket 0 00:05:43.952 EAL: Detected lcore 66 as core 11 on socket 0 00:05:43.952 EAL: Detected lcore 67 as core 12 on socket 0 00:05:43.952 EAL: Detected lcore 68 as core 13 on socket 0 00:05:43.952 EAL: Detected lcore 69 as core 14 on socket 0 00:05:43.952 EAL: Detected lcore 70 as core 16 on socket 0 00:05:43.952 EAL: Detected lcore 71 as core 17 on socket 0 00:05:43.952 EAL: Detected lcore 72 as core 18 on socket 0 00:05:43.952 EAL: Detected lcore 73 as core 19 on socket 0 00:05:43.952 EAL: Detected lcore 74 as core 20 on socket 0 00:05:43.952 EAL: Detected lcore 75 as core 21 on socket 0 00:05:43.952 EAL: Detected lcore 76 as core 22 on socket 0 00:05:43.952 EAL: Detected lcore 77 as core 24 on socket 0 00:05:43.952 EAL: Detected lcore 78 as core 25 on socket 0 00:05:43.952 EAL: Detected lcore 79 as core 26 on socket 0 00:05:43.952 EAL: Detected lcore 80 as core 27 on socket 0 00:05:43.952 EAL: Detected lcore 81 as core 28 on socket 0 00:05:43.952 EAL: Detected lcore 82 as core 29 on socket 0 00:05:43.952 EAL: Detected lcore 83 as core 30 on socket 0 00:05:43.952 EAL: Detected lcore 84 as core 0 on socket 1 00:05:43.952 EAL: Detected lcore 85 as core 1 on socket 1 00:05:43.952 EAL: Detected lcore 86 as core 2 on socket 1 00:05:43.952 EAL: Detected lcore 87 as core 3 on socket 1 00:05:43.952 EAL: Detected lcore 88 as core 4 on socket 1 00:05:43.952 EAL: Detected lcore 89 as core 5 on socket 1 00:05:43.952 EAL: Detected lcore 90 as core 6 on socket 1 00:05:43.952 EAL: Detected lcore 91 as core 8 on socket 1 00:05:43.952 EAL: Detected lcore 92 as core 9 on socket 1 00:05:43.952 EAL: Detected lcore 93 as core 10 on socket 1 00:05:43.952 EAL: Detected lcore 94 as core 11 on socket 1 00:05:43.952 EAL: Detected lcore 95 as core 12 on socket 1 00:05:43.952 EAL: Detected lcore 96 as core 13 on socket 1 00:05:43.952 EAL: Detected lcore 97 as core 14 on socket 1 00:05:43.952 EAL: Detected lcore 98 as core 16 on socket 1 00:05:43.952 EAL: Detected lcore 99 as core 17 on socket 1 00:05:43.952 EAL: Detected lcore 100 as core 18 on socket 1 00:05:43.952 EAL: Detected lcore 101 as core 19 on socket 1 00:05:43.952 EAL: Detected lcore 102 as core 20 on socket 1 00:05:43.952 EAL: Detected lcore 103 as core 21 on socket 1 00:05:43.952 EAL: Detected lcore 104 as core 22 on socket 1 00:05:43.952 EAL: Detected lcore 105 as core 24 on socket 1 00:05:43.952 EAL: Detected lcore 106 as core 25 on socket 1 00:05:43.952 EAL: Detected lcore 107 as core 26 on socket 1 00:05:43.952 EAL: Detected lcore 108 as core 27 on socket 1 00:05:43.952 EAL: Detected lcore 109 as core 28 on socket 1 00:05:43.952 EAL: Detected lcore 110 as core 29 on socket 1 00:05:43.952 EAL: Detected lcore 111 as core 30 on socket 1 00:05:43.952 EAL: Maximum logical cores by configuration: 128 00:05:43.952 EAL: Detected CPU lcores: 112 00:05:43.952 EAL: Detected NUMA nodes: 2 
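
The lcore map EAL prints here ("Detected lcore X as core Y on socket Z") comes from the same CPU topology the kernel exports through sysfs, and the hugepage pools it is about to use are the ones `setup.sh status` listed earlier (2048 free of 2048 total 2048 kB pages on node0). A quick shell cross-check of both, using standard sysfs paths:

  # Reproduce the lcore -> (core, socket) mapping EAL just printed:
  for cpu in /sys/devices/system/cpu/cpu[0-9]*; do
    printf '%s: core %s on socket %s\n' "${cpu##*/}" \
      "$(cat "$cpu/topology/core_id")" \
      "$(cat "$cpu/topology/physical_package_id")"
  done
  # Per-NUMA-node 2 MB hugepage pools (free and total):
  grep . /sys/devices/system/node/node*/hugepages/hugepages-2048kB/{free,nr}_hugepages
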
00:05:43.952 EAL: Checking presence of .so 'librte_eal.so.24.2' 00:05:43.952 EAL: Checking presence of .so 'librte_eal.so.24' 00:05:43.952 EAL: Checking presence of .so 'librte_eal.so' 00:05:43.952 EAL: Detected static linkage of DPDK 00:05:43.952 EAL: No shared files mode enabled, IPC will be disabled 00:05:43.952 EAL: Bus pci wants IOVA as 'DC' 00:05:43.952 EAL: Buses did not request a specific IOVA mode. 00:05:43.952 EAL: IOMMU is available, selecting IOVA as VA mode. 00:05:43.952 EAL: Selected IOVA mode 'VA' 00:05:43.952 EAL: No free 2048 kB hugepages reported on node 1 00:05:43.952 EAL: Probing VFIO support... 00:05:43.952 EAL: IOMMU type 1 (Type 1) is supported 00:05:43.952 EAL: IOMMU type 7 (sPAPR) is not supported 00:05:43.952 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:05:43.952 EAL: VFIO support initialized 00:05:43.952 EAL: Ask a virtual area of 0x2e000 bytes 00:05:43.952 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:05:43.952 EAL: Setting up physically contiguous memory... 00:05:43.952 EAL: Setting maximum number of open files to 524288 00:05:43.952 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:05:43.952 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:05:43.952 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:05:43.952 EAL: Ask a virtual area of 0x61000 bytes 00:05:43.952 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:05:43.952 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:43.952 EAL: Ask a virtual area of 0x400000000 bytes 00:05:43.952 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:05:43.952 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:05:43.952 EAL: Ask a virtual area of 0x61000 bytes 00:05:43.952 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:05:43.952 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:43.952 EAL: Ask a virtual area of 0x400000000 bytes 00:05:43.952 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:05:43.952 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:05:43.952 EAL: Ask a virtual area of 0x61000 bytes 00:05:43.952 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:05:43.952 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:43.952 EAL: Ask a virtual area of 0x400000000 bytes 00:05:43.952 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:05:43.952 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:05:43.952 EAL: Ask a virtual area of 0x61000 bytes 00:05:43.952 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:05:43.952 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:43.952 EAL: Ask a virtual area of 0x400000000 bytes 00:05:43.952 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:05:43.952 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:05:43.952 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:05:43.952 EAL: Ask a virtual area of 0x61000 bytes 00:05:43.952 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:05:43.952 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:43.952 EAL: Ask a virtual area of 0x400000000 bytes 00:05:43.952 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:05:43.952 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:05:43.952 EAL: Ask a virtual area of 0x61000 bytes 00:05:43.952 EAL: Virtual area found at 
0x201400a00000 (size = 0x61000) 00:05:43.952 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:43.952 EAL: Ask a virtual area of 0x400000000 bytes 00:05:43.953 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:05:43.953 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:05:43.953 EAL: Ask a virtual area of 0x61000 bytes 00:05:43.953 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:05:43.953 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:43.953 EAL: Ask a virtual area of 0x400000000 bytes 00:05:43.953 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:05:43.953 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:05:43.953 EAL: Ask a virtual area of 0x61000 bytes 00:05:43.953 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:05:43.953 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:43.953 EAL: Ask a virtual area of 0x400000000 bytes 00:05:43.953 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:05:43.953 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:05:43.953 EAL: Hugepages will be freed exactly as allocated. 00:05:43.953 EAL: No shared files mode enabled, IPC is disabled 00:05:43.953 EAL: No shared files mode enabled, IPC is disabled 00:05:43.953 EAL: TSC frequency is ~2500000 KHz 00:05:43.953 EAL: Main lcore 0 is ready (tid=7f091a0dba00;cpuset=[0]) 00:05:43.953 EAL: Trying to obtain current memory policy. 00:05:43.953 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:43.953 EAL: Restoring previous memory policy: 0 00:05:43.953 EAL: request: mp_malloc_sync 00:05:43.953 EAL: No shared files mode enabled, IPC is disabled 00:05:43.953 EAL: Heap on socket 0 was expanded by 2MB 00:05:43.953 EAL: No shared files mode enabled, IPC is disabled 00:05:43.953 EAL: Mem event callback 'spdk:(nil)' registered 00:05:43.953 00:05:43.953 00:05:43.953 CUnit - A unit testing framework for C - Version 2.1-3 00:05:43.953 http://cunit.sourceforge.net/ 00:05:43.953 00:05:43.953 00:05:43.953 Suite: components_suite 00:05:43.953 Test: vtophys_malloc_test ...passed 00:05:43.953 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:05:43.953 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:43.953 EAL: Restoring previous memory policy: 4 00:05:43.953 EAL: Calling mem event callback 'spdk:(nil)' 00:05:43.953 EAL: request: mp_malloc_sync 00:05:43.953 EAL: No shared files mode enabled, IPC is disabled 00:05:43.953 EAL: Heap on socket 0 was expanded by 4MB 00:05:43.953 EAL: Calling mem event callback 'spdk:(nil)' 00:05:43.953 EAL: request: mp_malloc_sync 00:05:43.953 EAL: No shared files mode enabled, IPC is disabled 00:05:43.953 EAL: Heap on socket 0 was shrunk by 4MB 00:05:43.953 EAL: Trying to obtain current memory policy. 00:05:43.953 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:43.953 EAL: Restoring previous memory policy: 4 00:05:43.953 EAL: Calling mem event callback 'spdk:(nil)' 00:05:43.953 EAL: request: mp_malloc_sync 00:05:43.953 EAL: No shared files mode enabled, IPC is disabled 00:05:43.953 EAL: Heap on socket 0 was expanded by 6MB 00:05:43.953 EAL: Calling mem event callback 'spdk:(nil)' 00:05:43.953 EAL: request: mp_malloc_sync 00:05:43.953 EAL: No shared files mode enabled, IPC is disabled 00:05:43.953 EAL: Heap on socket 0 was shrunk by 6MB 00:05:43.953 EAL: Trying to obtain current memory policy. 
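The "Mem event callback" lines here are SPDK's hook into EAL's dynamic memory subsystem: each time the heap grows or shrinks, EAL invokes the callback registered under the name "spdk" so the affected range can be made DMA-safe. A rough sketch of the registration mechanism using the public rte_memory.h API; SPDK's real callback (in lib/env_dpdk) does considerably more:

    #include <stdio.h>
    #include <rte_memory.h>

    /* Invoked by EAL on every dynamic hugepage alloc/free, as traced above. */
    static void
    mem_event_cb(enum rte_mem_event type, const void *addr, size_t len, void *arg)
    {
        (void)arg;
        printf("%s: addr %p len %zu\n",
               type == RTE_MEM_EVENT_ALLOC ? "alloc" : "free", addr, len);
    }

    int register_example_cb(void)
    {
        /* The callback name and (NULL) arg are what print as "spdk:(nil)". */
        return rte_mem_event_callback_register("example", mem_event_cb, NULL);
    }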
00:05:43.953 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:43.953 EAL: Restoring previous memory policy: 4 00:05:43.953 EAL: Calling mem event callback 'spdk:(nil)' 00:05:43.953 EAL: request: mp_malloc_sync 00:05:43.953 EAL: No shared files mode enabled, IPC is disabled 00:05:43.953 EAL: Heap on socket 0 was expanded by 10MB 00:05:43.953 EAL: Calling mem event callback 'spdk:(nil)' 00:05:43.953 EAL: request: mp_malloc_sync 00:05:43.953 EAL: No shared files mode enabled, IPC is disabled 00:05:43.953 EAL: Heap on socket 0 was shrunk by 10MB 00:05:43.953 EAL: Trying to obtain current memory policy. 00:05:43.953 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:43.953 EAL: Restoring previous memory policy: 4 00:05:43.953 EAL: Calling mem event callback 'spdk:(nil)' 00:05:43.953 EAL: request: mp_malloc_sync 00:05:43.953 EAL: No shared files mode enabled, IPC is disabled 00:05:43.953 EAL: Heap on socket 0 was expanded by 18MB 00:05:43.953 EAL: Calling mem event callback 'spdk:(nil)' 00:05:43.953 EAL: request: mp_malloc_sync 00:05:43.953 EAL: No shared files mode enabled, IPC is disabled 00:05:43.953 EAL: Heap on socket 0 was shrunk by 18MB 00:05:43.953 EAL: Trying to obtain current memory policy. 00:05:43.953 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:43.953 EAL: Restoring previous memory policy: 4 00:05:43.953 EAL: Calling mem event callback 'spdk:(nil)' 00:05:43.953 EAL: request: mp_malloc_sync 00:05:43.953 EAL: No shared files mode enabled, IPC is disabled 00:05:43.953 EAL: Heap on socket 0 was expanded by 34MB 00:05:43.953 EAL: Calling mem event callback 'spdk:(nil)' 00:05:43.953 EAL: request: mp_malloc_sync 00:05:43.953 EAL: No shared files mode enabled, IPC is disabled 00:05:43.953 EAL: Heap on socket 0 was shrunk by 34MB 00:05:43.953 EAL: Trying to obtain current memory policy. 00:05:43.953 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:43.953 EAL: Restoring previous memory policy: 4 00:05:43.953 EAL: Calling mem event callback 'spdk:(nil)' 00:05:43.953 EAL: request: mp_malloc_sync 00:05:43.953 EAL: No shared files mode enabled, IPC is disabled 00:05:43.953 EAL: Heap on socket 0 was expanded by 66MB 00:05:43.953 EAL: Calling mem event callback 'spdk:(nil)' 00:05:43.953 EAL: request: mp_malloc_sync 00:05:43.953 EAL: No shared files mode enabled, IPC is disabled 00:05:43.953 EAL: Heap on socket 0 was shrunk by 66MB 00:05:43.953 EAL: Trying to obtain current memory policy. 00:05:43.953 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:43.953 EAL: Restoring previous memory policy: 4 00:05:43.953 EAL: Calling mem event callback 'spdk:(nil)' 00:05:43.953 EAL: request: mp_malloc_sync 00:05:43.953 EAL: No shared files mode enabled, IPC is disabled 00:05:43.953 EAL: Heap on socket 0 was expanded by 130MB 00:05:43.953 EAL: Calling mem event callback 'spdk:(nil)' 00:05:43.953 EAL: request: mp_malloc_sync 00:05:43.953 EAL: No shared files mode enabled, IPC is disabled 00:05:43.953 EAL: Heap on socket 0 was shrunk by 130MB 00:05:43.953 EAL: Trying to obtain current memory policy. 
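Each expand/shrink pair in this suite is one iteration of vtophys_malloc_test: an allocation of roughly doubling size forces the socket-0 heap to grow, the virtual-to-physical translation is verified, and the buffer is freed again. A hedged sketch of the core check using the public SPDK env API; the test's actual source differs:

    #include <inttypes.h>
    #include <stdio.h>
    #include <spdk/env.h>

    static int check_vtophys(size_t len)
    {
        /* DMA-safe, 4 KiB aligned buffer; grows the EAL heap as logged above. */
        void *buf = spdk_dma_malloc(len, 0x1000, NULL);
        if (buf == NULL)
            return -1;

        uint64_t mapped = len;
        uint64_t paddr = spdk_vtophys(buf, &mapped);
        if (paddr != SPDK_VTOPHYS_ERROR)
            printf("vaddr %p -> paddr 0x%" PRIx64 " (%" PRIu64 " contiguous bytes)\n",
                   buf, paddr, mapped);

        spdk_dma_free(buf);                  /* triggers the "shrunk by" lines */
        return paddr == SPDK_VTOPHYS_ERROR ? -1 : 0;
    }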
00:05:43.953 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:43.953 EAL: Restoring previous memory policy: 4 00:05:43.953 EAL: Calling mem event callback 'spdk:(nil)' 00:05:43.953 EAL: request: mp_malloc_sync 00:05:43.953 EAL: No shared files mode enabled, IPC is disabled 00:05:43.953 EAL: Heap on socket 0 was expanded by 258MB 00:05:44.213 EAL: Calling mem event callback 'spdk:(nil)' 00:05:44.213 EAL: request: mp_malloc_sync 00:05:44.213 EAL: No shared files mode enabled, IPC is disabled 00:05:44.213 EAL: Heap on socket 0 was shrunk by 258MB 00:05:44.213 EAL: Trying to obtain current memory policy. 00:05:44.213 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:44.213 EAL: Restoring previous memory policy: 4 00:05:44.213 EAL: Calling mem event callback 'spdk:(nil)' 00:05:44.213 EAL: request: mp_malloc_sync 00:05:44.213 EAL: No shared files mode enabled, IPC is disabled 00:05:44.213 EAL: Heap on socket 0 was expanded by 514MB 00:05:44.213 EAL: Calling mem event callback 'spdk:(nil)' 00:05:44.472 EAL: request: mp_malloc_sync 00:05:44.472 EAL: No shared files mode enabled, IPC is disabled 00:05:44.472 EAL: Heap on socket 0 was shrunk by 514MB 00:05:44.472 EAL: Trying to obtain current memory policy. 00:05:44.472 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:44.472 EAL: Restoring previous memory policy: 4 00:05:44.472 EAL: Calling mem event callback 'spdk:(nil)' 00:05:44.472 EAL: request: mp_malloc_sync 00:05:44.472 EAL: No shared files mode enabled, IPC is disabled 00:05:44.472 EAL: Heap on socket 0 was expanded by 1026MB 00:05:44.732 EAL: Calling mem event callback 'spdk:(nil)' 00:05:44.992 EAL: request: mp_malloc_sync 00:05:44.992 EAL: No shared files mode enabled, IPC is disabled 00:05:44.992 EAL: Heap on socket 0 was shrunk by 1026MB 00:05:44.992 passed 00:05:44.992 00:05:44.992 Run Summary: Type Total Ran Passed Failed Inactive 00:05:44.992 suites 1 1 n/a 0 0 00:05:44.992 tests 2 2 2 0 0 00:05:44.992 asserts 497 497 497 0 n/a 00:05:44.992 00:05:44.992 Elapsed time = 0.964 seconds 00:05:44.992 EAL: Calling mem event callback 'spdk:(nil)' 00:05:44.992 EAL: request: mp_malloc_sync 00:05:44.992 EAL: No shared files mode enabled, IPC is disabled 00:05:44.992 EAL: Heap on socket 0 was shrunk by 2MB 00:05:44.992 EAL: No shared files mode enabled, IPC is disabled 00:05:44.992 EAL: No shared files mode enabled, IPC is disabled 00:05:44.992 EAL: No shared files mode enabled, IPC is disabled 00:05:44.992 00:05:44.992 real 0m1.081s 00:05:44.992 user 0m0.623s 00:05:44.992 sys 0m0.430s 00:05:44.992 20:53:00 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:44.992 20:53:00 -- common/autotest_common.sh@10 -- # set +x 00:05:44.992 ************************************ 00:05:44.992 END TEST env_vtophys 00:05:44.992 ************************************ 00:05:44.992 20:53:00 -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:05:44.992 20:53:00 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:44.992 20:53:00 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:44.992 20:53:00 -- common/autotest_common.sh@10 -- # set +x 00:05:44.992 ************************************ 00:05:44.992 START TEST env_pci 00:05:44.992 ************************************ 00:05:44.992 20:53:00 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:05:45.252 00:05:45.252 00:05:45.252 CUnit - A unit testing framework for C - Version 2.1-3 00:05:45.252 
http://cunit.sourceforge.net/ 00:05:45.252 00:05:45.252 00:05:45.252 Suite: pci 00:05:45.252 Test: pci_hook ...[2024-04-25 20:53:00.668352] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/pci.c:1041:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 170076 has claimed it 00:05:45.252 EAL: Cannot find device (10000:00:01.0) 00:05:45.252 EAL: Failed to attach device on primary process 00:05:45.252 passed 00:05:45.252 00:05:45.252 Run Summary: Type Total Ran Passed Failed Inactive 00:05:45.252 suites 1 1 n/a 0 0 00:05:45.252 tests 1 1 1 0 0 00:05:45.252 asserts 25 25 25 0 n/a 00:05:45.252 00:05:45.252 Elapsed time = 0.038 seconds 00:05:45.252 00:05:45.252 real 0m0.056s 00:05:45.252 user 0m0.013s 00:05:45.252 sys 0m0.043s 00:05:45.252 20:53:00 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:45.252 20:53:00 -- common/autotest_common.sh@10 -- # set +x 00:05:45.252 ************************************ 00:05:45.252 END TEST env_pci 00:05:45.252 ************************************ 00:05:45.252 20:53:00 -- env/env.sh@14 -- # argv='-c 0x1 ' 00:05:45.252 20:53:00 -- env/env.sh@15 -- # uname 00:05:45.252 20:53:00 -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:05:45.252 20:53:00 -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:05:45.252 20:53:00 -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:45.252 20:53:00 -- common/autotest_common.sh@1087 -- # '[' 5 -le 1 ']' 00:05:45.252 20:53:00 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:45.252 20:53:00 -- common/autotest_common.sh@10 -- # set +x 00:05:45.511 ************************************ 00:05:45.511 START TEST env_dpdk_post_init 00:05:45.511 ************************************ 00:05:45.512 20:53:00 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:45.512 EAL: Detected CPU lcores: 112 00:05:45.512 EAL: Detected NUMA nodes: 2 00:05:45.512 EAL: Detected static linkage of DPDK 00:05:45.512 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:45.512 EAL: Selected IOVA mode 'VA' 00:05:45.512 EAL: No free 2048 kB hugepages reported on node 1 00:05:45.512 EAL: VFIO support initialized 00:05:45.512 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:45.512 EAL: Using IOMMU type 1 (Type 1) 00:05:46.454 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:d8:00.0 (socket 1) 00:05:49.742 EAL: Releasing PCI mapped resource for 0000:d8:00.0 00:05:49.742 EAL: Calling pci_unmap_resource for 0000:d8:00.0 at 0x202001000000 00:05:50.001 Starting DPDK initialization... 00:05:50.001 Starting SPDK post initialization... 00:05:50.001 SPDK NVMe probe 00:05:50.001 Attaching to 0000:d8:00.0 00:05:50.001 Attached to 0000:d8:00.0 00:05:50.001 Cleaning up... 
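env_dpdk_post_init brings up the full SPDK environment (note the '-c 0x1 --base-virtaddr=0x200000000000' argv above) before probing the NVMe device at 0000:d8:00.0. A minimal sketch of that initialization, assuming the spdk/env.h API; the probe/attach callbacks of the real test are omitted:

    #include <stdio.h>
    #include <spdk/env.h>

    int main(void)
    {
        struct spdk_env_opts opts;

        spdk_env_opts_init(&opts);
        opts.name = "post_init_example";        /* illustrative name */
        opts.core_mask = "0x1";                 /* mirrors -c 0x1 */
        opts.base_virtaddr = 0x200000000000ULL; /* mirrors --base-virtaddr */

        if (spdk_env_init(&opts) < 0) {
            fprintf(stderr, "spdk_env_init failed\n");
            return 1;
        }
        /* ... spdk_nvme_probe() would attach the controller here ... */
        spdk_env_fini();
        return 0;
    }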
00:05:50.001 00:05:50.001 real 0m4.698s 00:05:50.001 user 0m3.521s 00:05:50.001 sys 0m0.420s 00:05:50.001 20:53:05 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:50.001 20:53:05 -- common/autotest_common.sh@10 -- # set +x 00:05:50.001 ************************************ 00:05:50.001 END TEST env_dpdk_post_init 00:05:50.001 ************************************ 00:05:50.261 20:53:05 -- env/env.sh@26 -- # uname 00:05:50.261 20:53:05 -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:05:50.261 20:53:05 -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:50.261 20:53:05 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:50.261 20:53:05 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:50.261 20:53:05 -- common/autotest_common.sh@10 -- # set +x 00:05:50.261 ************************************ 00:05:50.261 START TEST env_mem_callbacks 00:05:50.261 ************************************ 00:05:50.261 20:53:05 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:50.261 EAL: Detected CPU lcores: 112 00:05:50.261 EAL: Detected NUMA nodes: 2 00:05:50.261 EAL: Detected static linkage of DPDK 00:05:50.261 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:50.261 EAL: Selected IOVA mode 'VA' 00:05:50.261 EAL: No free 2048 kB hugepages reported on node 1 00:05:50.261 EAL: VFIO support initialized 00:05:50.261 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:50.261 00:05:50.261 00:05:50.261 CUnit - A unit testing framework for C - Version 2.1-3 00:05:50.261 http://cunit.sourceforge.net/ 00:05:50.261 00:05:50.261 00:05:50.261 Suite: memory 00:05:50.261 Test: test ... 
00:05:50.261 register 0x200000200000 2097152 00:05:50.261 malloc 3145728 00:05:50.261 register 0x200000400000 4194304 00:05:50.261 buf 0x200000500000 len 3145728 PASSED 00:05:50.261 malloc 64 00:05:50.261 buf 0x2000004fff40 len 64 PASSED 00:05:50.261 malloc 4194304 00:05:50.261 register 0x200000800000 6291456 00:05:50.261 buf 0x200000a00000 len 4194304 PASSED 00:05:50.261 free 0x200000500000 3145728 00:05:50.261 free 0x2000004fff40 64 00:05:50.261 unregister 0x200000400000 4194304 PASSED 00:05:50.261 free 0x200000a00000 4194304 00:05:50.261 unregister 0x200000800000 6291456 PASSED 00:05:50.261 malloc 8388608 00:05:50.261 register 0x200000400000 10485760 00:05:50.261 buf 0x200000600000 len 8388608 PASSED 00:05:50.261 free 0x200000600000 8388608 00:05:50.261 unregister 0x200000400000 10485760 PASSED 00:05:50.261 passed 00:05:50.261 00:05:50.261 Run Summary: Type Total Ran Passed Failed Inactive 00:05:50.261 suites 1 1 n/a 0 0 00:05:50.261 tests 1 1 1 0 0 00:05:50.261 asserts 15 15 15 0 n/a 00:05:50.261 00:05:50.261 Elapsed time = 0.007 seconds 00:05:50.261 00:05:50.261 real 0m0.065s 00:05:50.261 user 0m0.020s 00:05:50.261 sys 0m0.044s 00:05:50.261 20:53:05 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:50.261 20:53:05 -- common/autotest_common.sh@10 -- # set +x 00:05:50.261 ************************************ 00:05:50.261 END TEST env_mem_callbacks 00:05:50.261 ************************************ 00:05:50.521 00:05:50.521 real 0m7.152s 00:05:50.521 user 0m4.679s 00:05:50.521 sys 0m1.607s 00:05:50.521 20:53:05 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:50.521 20:53:05 -- common/autotest_common.sh@10 -- # set +x 00:05:50.521 ************************************ 00:05:50.521 END TEST env 00:05:50.521 ************************************ 00:05:50.521 20:53:05 -- spdk/autotest.sh@165 -- # run_test rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:05:50.521 20:53:05 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:50.521 20:53:05 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:50.521 20:53:05 -- common/autotest_common.sh@10 -- # set +x 00:05:50.521 ************************************ 00:05:50.521 START TEST rpc 00:05:50.521 ************************************ 00:05:50.521 20:53:06 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:05:50.780 * Looking for test storage... 00:05:50.780 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:50.780 20:53:06 -- rpc/rpc.sh@65 -- # spdk_pid=171127 00:05:50.780 20:53:06 -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:50.780 20:53:06 -- rpc/rpc.sh@67 -- # waitforlisten 171127 00:05:50.780 20:53:06 -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:05:50.780 20:53:06 -- common/autotest_common.sh@817 -- # '[' -z 171127 ']' 00:05:50.780 20:53:06 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:50.780 20:53:06 -- common/autotest_common.sh@822 -- # local max_retries=100 00:05:50.780 20:53:06 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:50.780 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
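The register/unregister pairs traced in the mem_callbacks test above come from a notify callback the test installs on SPDK's memory map: every malloc that grows the heap fires a register notification, every free an unregister. Applications hit the same path explicitly when they bring their own buffers; a sketch of that public API, assuming spdk/env.h and a suitably aligned, pinned external buffer, not the test's own callback code:

    #include <spdk/env.h>

    /* Make an externally allocated buffer visible to spdk_vtophys(). */
    int use_external_buffer(void *buf, size_t len)
    {
        if (spdk_mem_register(buf, len) != 0)
            return -1;
        /* ... DMA to/from buf ... */
        return spdk_mem_unregister(buf, len);
    }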
00:05:50.780 20:53:06 -- common/autotest_common.sh@826 -- # xtrace_disable 00:05:50.780 20:53:06 -- common/autotest_common.sh@10 -- # set +x 00:05:50.780 [2024-04-25 20:53:06.271784] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:05:50.780 [2024-04-25 20:53:06.271849] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid171127 ] 00:05:50.780 EAL: No free 2048 kB hugepages reported on node 1 00:05:50.780 [2024-04-25 20:53:06.306317] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:05:50.780 [2024-04-25 20:53:06.338261] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:50.780 [2024-04-25 20:53:06.375894] app.c: 523:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:05:50.780 [2024-04-25 20:53:06.375930] app.c: 527:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 171127' to capture a snapshot of events at runtime. 00:05:50.780 [2024-04-25 20:53:06.375940] app.c: 529:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:05:50.780 [2024-04-25 20:53:06.375949] app.c: 530:app_setup_trace: *NOTICE*: SPDK application currently running. 00:05:50.780 [2024-04-25 20:53:06.375956] app.c: 531:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid171127 for offline analysis/debug. 00:05:50.780 [2024-04-25 20:53:06.375975] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:51.039 20:53:06 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:05:51.040 20:53:06 -- common/autotest_common.sh@850 -- # return 0 00:05:51.040 20:53:06 -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:51.040 20:53:06 -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:51.040 20:53:06 -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:05:51.040 20:53:06 -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:05:51.040 20:53:06 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:51.040 20:53:06 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:51.040 20:53:06 -- common/autotest_common.sh@10 -- # set +x 00:05:51.040 ************************************ 00:05:51.040 START TEST rpc_integrity 00:05:51.040 ************************************ 00:05:51.040 20:53:06 -- common/autotest_common.sh@1111 -- # rpc_integrity 00:05:51.040 20:53:06 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:51.040 20:53:06 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:51.040 20:53:06 -- common/autotest_common.sh@10 -- # set +x 00:05:51.040 20:53:06 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:51.040 20:53:06 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:51.040 20:53:06 -- rpc/rpc.sh@13 -- # jq length 
00:05:51.298 20:53:06 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:51.298 20:53:06 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:51.298 20:53:06 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:51.298 20:53:06 -- common/autotest_common.sh@10 -- # set +x 00:05:51.298 20:53:06 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:51.298 20:53:06 -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:05:51.298 20:53:06 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:51.298 20:53:06 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:51.298 20:53:06 -- common/autotest_common.sh@10 -- # set +x 00:05:51.298 20:53:06 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:51.298 20:53:06 -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:51.298 { 00:05:51.298 "name": "Malloc0", 00:05:51.298 "aliases": [ 00:05:51.298 "883f1014-3c15-4bf2-a774-61262eb1701c" 00:05:51.298 ], 00:05:51.298 "product_name": "Malloc disk", 00:05:51.298 "block_size": 512, 00:05:51.298 "num_blocks": 16384, 00:05:51.298 "uuid": "883f1014-3c15-4bf2-a774-61262eb1701c", 00:05:51.298 "assigned_rate_limits": { 00:05:51.298 "rw_ios_per_sec": 0, 00:05:51.298 "rw_mbytes_per_sec": 0, 00:05:51.298 "r_mbytes_per_sec": 0, 00:05:51.298 "w_mbytes_per_sec": 0 00:05:51.298 }, 00:05:51.298 "claimed": false, 00:05:51.298 "zoned": false, 00:05:51.298 "supported_io_types": { 00:05:51.298 "read": true, 00:05:51.298 "write": true, 00:05:51.298 "unmap": true, 00:05:51.298 "write_zeroes": true, 00:05:51.298 "flush": true, 00:05:51.299 "reset": true, 00:05:51.299 "compare": false, 00:05:51.299 "compare_and_write": false, 00:05:51.299 "abort": true, 00:05:51.299 "nvme_admin": false, 00:05:51.299 "nvme_io": false 00:05:51.299 }, 00:05:51.299 "memory_domains": [ 00:05:51.299 { 00:05:51.299 "dma_device_id": "system", 00:05:51.299 "dma_device_type": 1 00:05:51.299 }, 00:05:51.299 { 00:05:51.299 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:51.299 "dma_device_type": 2 00:05:51.299 } 00:05:51.299 ], 00:05:51.299 "driver_specific": {} 00:05:51.299 } 00:05:51.299 ]' 00:05:51.299 20:53:06 -- rpc/rpc.sh@17 -- # jq length 00:05:51.299 20:53:06 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:51.299 20:53:06 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:05:51.299 20:53:06 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:51.299 20:53:06 -- common/autotest_common.sh@10 -- # set +x 00:05:51.299 [2024-04-25 20:53:06.814541] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:05:51.299 [2024-04-25 20:53:06.814573] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:51.299 [2024-04-25 20:53:06.814594] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x5f72920 00:05:51.299 [2024-04-25 20:53:06.814603] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:51.299 [2024-04-25 20:53:06.815448] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:51.299 [2024-04-25 20:53:06.815470] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:51.299 Passthru0 00:05:51.299 20:53:06 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:51.299 20:53:06 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:51.299 20:53:06 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:51.299 20:53:06 -- common/autotest_common.sh@10 -- # set +x 00:05:51.299 20:53:06 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:51.299 20:53:06 -- rpc/rpc.sh@20 -- # bdevs='[ 
00:05:51.299 { 00:05:51.299 "name": "Malloc0", 00:05:51.299 "aliases": [ 00:05:51.299 "883f1014-3c15-4bf2-a774-61262eb1701c" 00:05:51.299 ], 00:05:51.299 "product_name": "Malloc disk", 00:05:51.299 "block_size": 512, 00:05:51.299 "num_blocks": 16384, 00:05:51.299 "uuid": "883f1014-3c15-4bf2-a774-61262eb1701c", 00:05:51.299 "assigned_rate_limits": { 00:05:51.299 "rw_ios_per_sec": 0, 00:05:51.299 "rw_mbytes_per_sec": 0, 00:05:51.299 "r_mbytes_per_sec": 0, 00:05:51.299 "w_mbytes_per_sec": 0 00:05:51.299 }, 00:05:51.299 "claimed": true, 00:05:51.299 "claim_type": "exclusive_write", 00:05:51.299 "zoned": false, 00:05:51.299 "supported_io_types": { 00:05:51.299 "read": true, 00:05:51.299 "write": true, 00:05:51.299 "unmap": true, 00:05:51.299 "write_zeroes": true, 00:05:51.299 "flush": true, 00:05:51.299 "reset": true, 00:05:51.299 "compare": false, 00:05:51.299 "compare_and_write": false, 00:05:51.299 "abort": true, 00:05:51.299 "nvme_admin": false, 00:05:51.299 "nvme_io": false 00:05:51.299 }, 00:05:51.299 "memory_domains": [ 00:05:51.299 { 00:05:51.299 "dma_device_id": "system", 00:05:51.299 "dma_device_type": 1 00:05:51.299 }, 00:05:51.299 { 00:05:51.299 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:51.299 "dma_device_type": 2 00:05:51.299 } 00:05:51.299 ], 00:05:51.299 "driver_specific": {} 00:05:51.299 }, 00:05:51.299 { 00:05:51.299 "name": "Passthru0", 00:05:51.299 "aliases": [ 00:05:51.299 "423052ad-89a0-5077-9022-7b3a5fe8dc46" 00:05:51.299 ], 00:05:51.299 "product_name": "passthru", 00:05:51.299 "block_size": 512, 00:05:51.299 "num_blocks": 16384, 00:05:51.299 "uuid": "423052ad-89a0-5077-9022-7b3a5fe8dc46", 00:05:51.299 "assigned_rate_limits": { 00:05:51.299 "rw_ios_per_sec": 0, 00:05:51.299 "rw_mbytes_per_sec": 0, 00:05:51.299 "r_mbytes_per_sec": 0, 00:05:51.299 "w_mbytes_per_sec": 0 00:05:51.299 }, 00:05:51.299 "claimed": false, 00:05:51.299 "zoned": false, 00:05:51.299 "supported_io_types": { 00:05:51.299 "read": true, 00:05:51.299 "write": true, 00:05:51.299 "unmap": true, 00:05:51.299 "write_zeroes": true, 00:05:51.299 "flush": true, 00:05:51.299 "reset": true, 00:05:51.299 "compare": false, 00:05:51.299 "compare_and_write": false, 00:05:51.299 "abort": true, 00:05:51.299 "nvme_admin": false, 00:05:51.299 "nvme_io": false 00:05:51.299 }, 00:05:51.299 "memory_domains": [ 00:05:51.299 { 00:05:51.299 "dma_device_id": "system", 00:05:51.299 "dma_device_type": 1 00:05:51.299 }, 00:05:51.299 { 00:05:51.299 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:51.299 "dma_device_type": 2 00:05:51.299 } 00:05:51.299 ], 00:05:51.299 "driver_specific": { 00:05:51.299 "passthru": { 00:05:51.299 "name": "Passthru0", 00:05:51.299 "base_bdev_name": "Malloc0" 00:05:51.299 } 00:05:51.299 } 00:05:51.299 } 00:05:51.299 ]' 00:05:51.299 20:53:06 -- rpc/rpc.sh@21 -- # jq length 00:05:51.299 20:53:06 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:51.299 20:53:06 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:51.299 20:53:06 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:51.299 20:53:06 -- common/autotest_common.sh@10 -- # set +x 00:05:51.299 20:53:06 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:51.299 20:53:06 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:05:51.299 20:53:06 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:51.299 20:53:06 -- common/autotest_common.sh@10 -- # set +x 00:05:51.299 20:53:06 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:51.299 20:53:06 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:51.299 20:53:06 -- 
common/autotest_common.sh@549 -- # xtrace_disable 00:05:51.299 20:53:06 -- common/autotest_common.sh@10 -- # set +x 00:05:51.299 20:53:06 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:51.299 20:53:06 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:51.299 20:53:06 -- rpc/rpc.sh@26 -- # jq length 00:05:51.299 20:53:06 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:51.299 00:05:51.299 real 0m0.266s 00:05:51.299 user 0m0.163s 00:05:51.299 sys 0m0.049s 00:05:51.299 20:53:06 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:51.299 20:53:06 -- common/autotest_common.sh@10 -- # set +x 00:05:51.299 ************************************ 00:05:51.299 END TEST rpc_integrity 00:05:51.299 ************************************ 00:05:51.558 20:53:06 -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:05:51.558 20:53:06 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:51.559 20:53:06 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:51.559 20:53:06 -- common/autotest_common.sh@10 -- # set +x 00:05:51.559 ************************************ 00:05:51.559 START TEST rpc_plugins 00:05:51.559 ************************************ 00:05:51.559 20:53:07 -- common/autotest_common.sh@1111 -- # rpc_plugins 00:05:51.559 20:53:07 -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:05:51.559 20:53:07 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:51.559 20:53:07 -- common/autotest_common.sh@10 -- # set +x 00:05:51.559 20:53:07 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:51.559 20:53:07 -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:05:51.559 20:53:07 -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:05:51.559 20:53:07 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:51.559 20:53:07 -- common/autotest_common.sh@10 -- # set +x 00:05:51.559 20:53:07 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:51.559 20:53:07 -- rpc/rpc.sh@31 -- # bdevs='[ 00:05:51.559 { 00:05:51.559 "name": "Malloc1", 00:05:51.559 "aliases": [ 00:05:51.559 "698880f8-c2a4-4d04-84a7-d94b0663b609" 00:05:51.559 ], 00:05:51.559 "product_name": "Malloc disk", 00:05:51.559 "block_size": 4096, 00:05:51.559 "num_blocks": 256, 00:05:51.559 "uuid": "698880f8-c2a4-4d04-84a7-d94b0663b609", 00:05:51.559 "assigned_rate_limits": { 00:05:51.559 "rw_ios_per_sec": 0, 00:05:51.559 "rw_mbytes_per_sec": 0, 00:05:51.559 "r_mbytes_per_sec": 0, 00:05:51.559 "w_mbytes_per_sec": 0 00:05:51.559 }, 00:05:51.559 "claimed": false, 00:05:51.559 "zoned": false, 00:05:51.559 "supported_io_types": { 00:05:51.559 "read": true, 00:05:51.559 "write": true, 00:05:51.559 "unmap": true, 00:05:51.559 "write_zeroes": true, 00:05:51.559 "flush": true, 00:05:51.559 "reset": true, 00:05:51.559 "compare": false, 00:05:51.559 "compare_and_write": false, 00:05:51.559 "abort": true, 00:05:51.559 "nvme_admin": false, 00:05:51.559 "nvme_io": false 00:05:51.559 }, 00:05:51.559 "memory_domains": [ 00:05:51.559 { 00:05:51.559 "dma_device_id": "system", 00:05:51.559 "dma_device_type": 1 00:05:51.559 }, 00:05:51.559 { 00:05:51.559 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:51.559 "dma_device_type": 2 00:05:51.559 } 00:05:51.559 ], 00:05:51.559 "driver_specific": {} 00:05:51.559 } 00:05:51.559 ]' 00:05:51.559 20:53:07 -- rpc/rpc.sh@32 -- # jq length 00:05:51.559 20:53:07 -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:05:51.559 20:53:07 -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:05:51.559 20:53:07 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:51.559 20:53:07 -- 
common/autotest_common.sh@10 -- # set +x 00:05:51.559 20:53:07 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:51.559 20:53:07 -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:05:51.559 20:53:07 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:51.559 20:53:07 -- common/autotest_common.sh@10 -- # set +x 00:05:51.559 20:53:07 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:51.559 20:53:07 -- rpc/rpc.sh@35 -- # bdevs='[]' 00:05:51.559 20:53:07 -- rpc/rpc.sh@36 -- # jq length 00:05:51.817 20:53:07 -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:05:51.817 00:05:51.817 real 0m0.126s 00:05:51.817 user 0m0.082s 00:05:51.817 sys 0m0.017s 00:05:51.817 20:53:07 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:51.817 20:53:07 -- common/autotest_common.sh@10 -- # set +x 00:05:51.817 ************************************ 00:05:51.817 END TEST rpc_plugins 00:05:51.817 ************************************ 00:05:51.817 20:53:07 -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:05:51.817 20:53:07 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:51.817 20:53:07 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:51.817 20:53:07 -- common/autotest_common.sh@10 -- # set +x 00:05:51.817 ************************************ 00:05:51.817 START TEST rpc_trace_cmd_test 00:05:51.817 ************************************ 00:05:51.817 20:53:07 -- common/autotest_common.sh@1111 -- # rpc_trace_cmd_test 00:05:51.817 20:53:07 -- rpc/rpc.sh@40 -- # local info 00:05:51.817 20:53:07 -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:05:51.817 20:53:07 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:51.817 20:53:07 -- common/autotest_common.sh@10 -- # set +x 00:05:51.817 20:53:07 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:51.817 20:53:07 -- rpc/rpc.sh@42 -- # info='{ 00:05:51.817 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid171127", 00:05:51.817 "tpoint_group_mask": "0x8", 00:05:51.817 "iscsi_conn": { 00:05:51.817 "mask": "0x2", 00:05:51.817 "tpoint_mask": "0x0" 00:05:51.817 }, 00:05:51.817 "scsi": { 00:05:51.817 "mask": "0x4", 00:05:51.817 "tpoint_mask": "0x0" 00:05:51.817 }, 00:05:51.817 "bdev": { 00:05:51.817 "mask": "0x8", 00:05:51.817 "tpoint_mask": "0xffffffffffffffff" 00:05:51.817 }, 00:05:51.817 "nvmf_rdma": { 00:05:51.817 "mask": "0x10", 00:05:51.817 "tpoint_mask": "0x0" 00:05:51.817 }, 00:05:51.817 "nvmf_tcp": { 00:05:51.817 "mask": "0x20", 00:05:51.817 "tpoint_mask": "0x0" 00:05:51.817 }, 00:05:51.817 "ftl": { 00:05:51.817 "mask": "0x40", 00:05:51.817 "tpoint_mask": "0x0" 00:05:51.817 }, 00:05:51.817 "blobfs": { 00:05:51.817 "mask": "0x80", 00:05:51.817 "tpoint_mask": "0x0" 00:05:51.817 }, 00:05:51.817 "dsa": { 00:05:51.817 "mask": "0x200", 00:05:51.817 "tpoint_mask": "0x0" 00:05:51.817 }, 00:05:51.817 "thread": { 00:05:51.817 "mask": "0x400", 00:05:51.817 "tpoint_mask": "0x0" 00:05:51.817 }, 00:05:51.817 "nvme_pcie": { 00:05:51.817 "mask": "0x800", 00:05:51.817 "tpoint_mask": "0x0" 00:05:51.817 }, 00:05:51.817 "iaa": { 00:05:51.817 "mask": "0x1000", 00:05:51.817 "tpoint_mask": "0x0" 00:05:51.817 }, 00:05:51.817 "nvme_tcp": { 00:05:51.817 "mask": "0x2000", 00:05:51.817 "tpoint_mask": "0x0" 00:05:51.817 }, 00:05:51.817 "bdev_nvme": { 00:05:51.817 "mask": "0x4000", 00:05:51.817 "tpoint_mask": "0x0" 00:05:51.817 }, 00:05:51.817 "sock": { 00:05:51.817 "mask": "0x8000", 00:05:51.817 "tpoint_mask": "0x0" 00:05:51.817 } 00:05:51.817 }' 00:05:51.817 20:53:07 -- rpc/rpc.sh@43 -- # jq length 00:05:52.075 20:53:07 -- rpc/rpc.sh@43 -- 
# '[' 16 -gt 2 ']' 00:05:52.075 20:53:07 -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:05:52.075 20:53:07 -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:05:52.075 20:53:07 -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:05:52.075 20:53:07 -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:05:52.075 20:53:07 -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:05:52.075 20:53:07 -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:05:52.075 20:53:07 -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:05:52.076 20:53:07 -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:05:52.076 00:05:52.076 real 0m0.214s 00:05:52.076 user 0m0.174s 00:05:52.076 sys 0m0.031s 00:05:52.076 20:53:07 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:52.076 20:53:07 -- common/autotest_common.sh@10 -- # set +x 00:05:52.076 ************************************ 00:05:52.076 END TEST rpc_trace_cmd_test 00:05:52.076 ************************************ 00:05:52.076 20:53:07 -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:05:52.076 20:53:07 -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:05:52.076 20:53:07 -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:05:52.076 20:53:07 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:52.076 20:53:07 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:52.076 20:53:07 -- common/autotest_common.sh@10 -- # set +x 00:05:52.334 ************************************ 00:05:52.334 START TEST rpc_daemon_integrity 00:05:52.334 ************************************ 00:05:52.334 20:53:07 -- common/autotest_common.sh@1111 -- # rpc_integrity 00:05:52.334 20:53:07 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:52.334 20:53:07 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:52.334 20:53:07 -- common/autotest_common.sh@10 -- # set +x 00:05:52.334 20:53:07 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:52.334 20:53:07 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:52.334 20:53:07 -- rpc/rpc.sh@13 -- # jq length 00:05:52.334 20:53:07 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:52.334 20:53:07 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:52.334 20:53:07 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:52.334 20:53:07 -- common/autotest_common.sh@10 -- # set +x 00:05:52.334 20:53:07 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:52.334 20:53:07 -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:05:52.334 20:53:07 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:52.334 20:53:07 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:52.334 20:53:07 -- common/autotest_common.sh@10 -- # set +x 00:05:52.334 20:53:07 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:52.334 20:53:07 -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:52.334 { 00:05:52.334 "name": "Malloc2", 00:05:52.334 "aliases": [ 00:05:52.334 "51b404d3-91a9-45f9-bc34-b251c0356dcb" 00:05:52.334 ], 00:05:52.334 "product_name": "Malloc disk", 00:05:52.334 "block_size": 512, 00:05:52.334 "num_blocks": 16384, 00:05:52.334 "uuid": "51b404d3-91a9-45f9-bc34-b251c0356dcb", 00:05:52.334 "assigned_rate_limits": { 00:05:52.334 "rw_ios_per_sec": 0, 00:05:52.334 "rw_mbytes_per_sec": 0, 00:05:52.334 "r_mbytes_per_sec": 0, 00:05:52.334 "w_mbytes_per_sec": 0 00:05:52.334 }, 00:05:52.334 "claimed": false, 00:05:52.334 "zoned": false, 00:05:52.334 "supported_io_types": { 00:05:52.334 "read": true, 00:05:52.334 "write": true, 00:05:52.334 "unmap": true, 00:05:52.334 "write_zeroes": true, 00:05:52.334 "flush": true, 00:05:52.334 "reset": true, 00:05:52.334 "compare": false, 00:05:52.334 
"compare_and_write": false, 00:05:52.334 "abort": true, 00:05:52.334 "nvme_admin": false, 00:05:52.334 "nvme_io": false 00:05:52.334 }, 00:05:52.334 "memory_domains": [ 00:05:52.334 { 00:05:52.334 "dma_device_id": "system", 00:05:52.334 "dma_device_type": 1 00:05:52.334 }, 00:05:52.334 { 00:05:52.334 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:52.334 "dma_device_type": 2 00:05:52.334 } 00:05:52.334 ], 00:05:52.334 "driver_specific": {} 00:05:52.334 } 00:05:52.334 ]' 00:05:52.334 20:53:07 -- rpc/rpc.sh@17 -- # jq length 00:05:52.334 20:53:07 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:52.334 20:53:07 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:05:52.334 20:53:07 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:52.334 20:53:07 -- common/autotest_common.sh@10 -- # set +x 00:05:52.334 [2024-04-25 20:53:07.989581] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:05:52.334 [2024-04-25 20:53:07.989612] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:52.334 [2024-04-25 20:53:07.989627] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x61067f0 00:05:52.334 [2024-04-25 20:53:07.989640] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:52.334 [2024-04-25 20:53:07.990338] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:52.334 [2024-04-25 20:53:07.990359] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:52.334 Passthru0 00:05:52.334 20:53:07 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:52.334 20:53:07 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:52.334 20:53:07 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:52.334 20:53:07 -- common/autotest_common.sh@10 -- # set +x 00:05:52.593 20:53:08 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:52.593 20:53:08 -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:52.593 { 00:05:52.593 "name": "Malloc2", 00:05:52.593 "aliases": [ 00:05:52.593 "51b404d3-91a9-45f9-bc34-b251c0356dcb" 00:05:52.593 ], 00:05:52.593 "product_name": "Malloc disk", 00:05:52.593 "block_size": 512, 00:05:52.593 "num_blocks": 16384, 00:05:52.593 "uuid": "51b404d3-91a9-45f9-bc34-b251c0356dcb", 00:05:52.593 "assigned_rate_limits": { 00:05:52.593 "rw_ios_per_sec": 0, 00:05:52.593 "rw_mbytes_per_sec": 0, 00:05:52.593 "r_mbytes_per_sec": 0, 00:05:52.593 "w_mbytes_per_sec": 0 00:05:52.593 }, 00:05:52.593 "claimed": true, 00:05:52.593 "claim_type": "exclusive_write", 00:05:52.593 "zoned": false, 00:05:52.593 "supported_io_types": { 00:05:52.593 "read": true, 00:05:52.593 "write": true, 00:05:52.593 "unmap": true, 00:05:52.593 "write_zeroes": true, 00:05:52.593 "flush": true, 00:05:52.593 "reset": true, 00:05:52.593 "compare": false, 00:05:52.593 "compare_and_write": false, 00:05:52.593 "abort": true, 00:05:52.593 "nvme_admin": false, 00:05:52.593 "nvme_io": false 00:05:52.593 }, 00:05:52.593 "memory_domains": [ 00:05:52.593 { 00:05:52.593 "dma_device_id": "system", 00:05:52.593 "dma_device_type": 1 00:05:52.593 }, 00:05:52.593 { 00:05:52.593 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:52.593 "dma_device_type": 2 00:05:52.593 } 00:05:52.593 ], 00:05:52.593 "driver_specific": {} 00:05:52.593 }, 00:05:52.593 { 00:05:52.593 "name": "Passthru0", 00:05:52.593 "aliases": [ 00:05:52.593 "3a973258-e9dd-5a6f-88c0-1674829b2355" 00:05:52.593 ], 00:05:52.593 "product_name": "passthru", 00:05:52.593 "block_size": 512, 00:05:52.593 "num_blocks": 
16384, 00:05:52.593 "uuid": "3a973258-e9dd-5a6f-88c0-1674829b2355", 00:05:52.593 "assigned_rate_limits": { 00:05:52.593 "rw_ios_per_sec": 0, 00:05:52.593 "rw_mbytes_per_sec": 0, 00:05:52.593 "r_mbytes_per_sec": 0, 00:05:52.593 "w_mbytes_per_sec": 0 00:05:52.593 }, 00:05:52.593 "claimed": false, 00:05:52.593 "zoned": false, 00:05:52.593 "supported_io_types": { 00:05:52.593 "read": true, 00:05:52.593 "write": true, 00:05:52.593 "unmap": true, 00:05:52.593 "write_zeroes": true, 00:05:52.593 "flush": true, 00:05:52.593 "reset": true, 00:05:52.593 "compare": false, 00:05:52.593 "compare_and_write": false, 00:05:52.593 "abort": true, 00:05:52.593 "nvme_admin": false, 00:05:52.593 "nvme_io": false 00:05:52.593 }, 00:05:52.593 "memory_domains": [ 00:05:52.593 { 00:05:52.593 "dma_device_id": "system", 00:05:52.593 "dma_device_type": 1 00:05:52.593 }, 00:05:52.593 { 00:05:52.593 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:52.593 "dma_device_type": 2 00:05:52.593 } 00:05:52.593 ], 00:05:52.593 "driver_specific": { 00:05:52.593 "passthru": { 00:05:52.593 "name": "Passthru0", 00:05:52.593 "base_bdev_name": "Malloc2" 00:05:52.593 } 00:05:52.593 } 00:05:52.593 } 00:05:52.593 ]' 00:05:52.593 20:53:08 -- rpc/rpc.sh@21 -- # jq length 00:05:52.593 20:53:08 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:52.593 20:53:08 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:52.593 20:53:08 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:52.593 20:53:08 -- common/autotest_common.sh@10 -- # set +x 00:05:52.593 20:53:08 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:52.593 20:53:08 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:05:52.593 20:53:08 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:52.593 20:53:08 -- common/autotest_common.sh@10 -- # set +x 00:05:52.593 20:53:08 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:52.593 20:53:08 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:52.593 20:53:08 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:52.593 20:53:08 -- common/autotest_common.sh@10 -- # set +x 00:05:52.593 20:53:08 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:52.593 20:53:08 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:52.593 20:53:08 -- rpc/rpc.sh@26 -- # jq length 00:05:52.593 20:53:08 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:52.593 00:05:52.593 real 0m0.254s 00:05:52.593 user 0m0.167s 00:05:52.593 sys 0m0.033s 00:05:52.593 20:53:08 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:52.593 20:53:08 -- common/autotest_common.sh@10 -- # set +x 00:05:52.593 ************************************ 00:05:52.593 END TEST rpc_daemon_integrity 00:05:52.593 ************************************ 00:05:52.593 20:53:08 -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:05:52.593 20:53:08 -- rpc/rpc.sh@84 -- # killprocess 171127 00:05:52.593 20:53:08 -- common/autotest_common.sh@936 -- # '[' -z 171127 ']' 00:05:52.593 20:53:08 -- common/autotest_common.sh@940 -- # kill -0 171127 00:05:52.593 20:53:08 -- common/autotest_common.sh@941 -- # uname 00:05:52.593 20:53:08 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:52.593 20:53:08 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 171127 00:05:52.593 20:53:08 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:52.593 20:53:08 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:52.593 20:53:08 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 171127' 00:05:52.593 killing process with pid 171127 
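killprocess here is the harness's bash teardown: a kill -0 liveness probe, a check that the process isn't sudo, then a plain kill (SIGTERM) and wait. The same sequence as a C illustration only, since the harness itself is shell:

    #include <errno.h>
    #include <signal.h>
    #include <sys/types.h>
    #include <sys/wait.h>

    int killprocess_c(pid_t pid)
    {
        if (kill(pid, 0) != 0 && errno == ESRCH)
            return -1;              /* no such process, nothing to do */
        kill(pid, SIGTERM);         /* the log's plain "kill <pid>" */
        waitpid(pid, NULL, 0);      /* reap it, like the shell's "wait" */
        return 0;
    }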
00:05:52.593 20:53:08 -- common/autotest_common.sh@955 -- # kill 171127 00:05:52.593 20:53:08 -- common/autotest_common.sh@960 -- # wait 171127 00:05:53.160 00:05:53.160 real 0m2.371s 00:05:53.160 user 0m3.050s 00:05:53.160 sys 0m0.941s 00:05:53.160 20:53:08 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:53.160 20:53:08 -- common/autotest_common.sh@10 -- # set +x 00:05:53.160 ************************************ 00:05:53.160 END TEST rpc 00:05:53.160 ************************************ 00:05:53.160 20:53:08 -- spdk/autotest.sh@166 -- # run_test skip_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:05:53.160 20:53:08 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:53.160 20:53:08 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:53.160 20:53:08 -- common/autotest_common.sh@10 -- # set +x 00:05:53.160 ************************************ 00:05:53.160 START TEST skip_rpc 00:05:53.160 ************************************ 00:05:53.160 20:53:08 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:05:53.419 * Looking for test storage... 00:05:53.419 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:53.419 20:53:08 -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:05:53.419 20:53:08 -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:05:53.419 20:53:08 -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:05:53.419 20:53:08 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:53.419 20:53:08 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:53.419 20:53:08 -- common/autotest_common.sh@10 -- # set +x 00:05:53.419 ************************************ 00:05:53.419 START TEST skip_rpc 00:05:53.419 ************************************ 00:05:53.419 20:53:08 -- common/autotest_common.sh@1111 -- # test_skip_rpc 00:05:53.419 20:53:08 -- rpc/skip_rpc.sh@16 -- # local spdk_pid=171869 00:05:53.419 20:53:08 -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:53.419 20:53:08 -- rpc/skip_rpc.sh@19 -- # sleep 5 00:05:53.419 20:53:08 -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:05:53.419 [2024-04-25 20:53:09.011119] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:05:53.419 [2024-04-25 20:53:09.011198] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid171869 ] 00:05:53.419 EAL: No free 2048 kB hugepages reported on node 1 00:05:53.419 [2024-04-25 20:53:09.047430] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 
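skip_rpc launches spdk_tgt with --no-rpc-server (see the argv above), so the Unix-domain RPC socket is never created and the spdk_get_version probe that follows must fail. What rpc_cmd does under the hood, sketched as a raw JSON-RPC request in C; the socket path comes from the log, the rest is illustrative:

    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>
    #include <sys/types.h>
    #include <sys/socket.h>
    #include <sys/un.h>

    int main(void)
    {
        int fd = socket(AF_UNIX, SOCK_STREAM, 0);
        struct sockaddr_un sa = { .sun_family = AF_UNIX };
        strncpy(sa.sun_path, "/var/tmp/spdk.sock", sizeof(sa.sun_path) - 1);

        if (fd < 0 || connect(fd, (struct sockaddr *)&sa, sizeof(sa)) != 0) {
            perror("connect");      /* expected with --no-rpc-server */
            return 1;
        }
        const char *req =
            "{\"jsonrpc\":\"2.0\",\"method\":\"spdk_get_version\",\"id\":1}";
        if (write(fd, req, strlen(req)) > 0) {
            char buf[4096];
            ssize_t n = read(fd, buf, sizeof(buf) - 1);
            if (n > 0) { buf[n] = '\0'; printf("%s\n", buf); }
        }
        close(fd);
        return 0;
    }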
00:05:53.419 [2024-04-25 20:53:09.078794] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:53.678 [2024-04-25 20:53:09.115449] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:58.948 20:53:13 -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:05:58.948 20:53:13 -- common/autotest_common.sh@638 -- # local es=0 00:05:58.948 20:53:13 -- common/autotest_common.sh@640 -- # valid_exec_arg rpc_cmd spdk_get_version 00:05:58.948 20:53:13 -- common/autotest_common.sh@626 -- # local arg=rpc_cmd 00:05:58.948 20:53:13 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:05:58.948 20:53:13 -- common/autotest_common.sh@630 -- # type -t rpc_cmd 00:05:58.948 20:53:13 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:05:58.948 20:53:13 -- common/autotest_common.sh@641 -- # rpc_cmd spdk_get_version 00:05:58.948 20:53:13 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:58.948 20:53:13 -- common/autotest_common.sh@10 -- # set +x 00:05:58.948 20:53:13 -- common/autotest_common.sh@577 -- # [[ 1 == 0 ]] 00:05:58.948 20:53:13 -- common/autotest_common.sh@641 -- # es=1 00:05:58.948 20:53:13 -- common/autotest_common.sh@649 -- # (( es > 128 )) 00:05:58.948 20:53:13 -- common/autotest_common.sh@660 -- # [[ -n '' ]] 00:05:58.948 20:53:13 -- common/autotest_common.sh@665 -- # (( !es == 0 )) 00:05:58.948 20:53:13 -- rpc/skip_rpc.sh@22 -- # trap - SIGINT SIGTERM EXIT 00:05:58.948 20:53:13 -- rpc/skip_rpc.sh@23 -- # killprocess 171869 00:05:58.948 20:53:13 -- common/autotest_common.sh@936 -- # '[' -z 171869 ']' 00:05:58.948 20:53:13 -- common/autotest_common.sh@940 -- # kill -0 171869 00:05:58.948 20:53:13 -- common/autotest_common.sh@941 -- # uname 00:05:58.948 20:53:13 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:58.948 20:53:14 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 171869 00:05:58.948 20:53:14 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:58.948 20:53:14 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:58.948 20:53:14 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 171869' 00:05:58.948 killing process with pid 171869 00:05:58.948 20:53:14 -- common/autotest_common.sh@955 -- # kill 171869 00:05:58.948 20:53:14 -- common/autotest_common.sh@960 -- # wait 171869 00:05:58.948 00:05:58.948 real 0m5.351s 00:05:58.948 user 0m5.106s 00:05:58.948 sys 0m0.276s 00:05:58.948 20:53:14 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:58.948 20:53:14 -- common/autotest_common.sh@10 -- # set +x 00:05:58.948 ************************************ 00:05:58.948 END TEST skip_rpc 00:05:58.948 ************************************ 00:05:58.948 20:53:14 -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:05:58.948 20:53:14 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:58.948 20:53:14 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:58.948 20:53:14 -- common/autotest_common.sh@10 -- # set +x 00:05:58.948 ************************************ 00:05:58.948 START TEST skip_rpc_with_json 00:05:58.948 ************************************ 00:05:58.948 20:53:14 -- common/autotest_common.sh@1111 -- # test_skip_rpc_with_json 00:05:58.948 20:53:14 -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:05:58.948 20:53:14 -- rpc/skip_rpc.sh@28 -- # local spdk_pid=172844 00:05:58.948 20:53:14 -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:58.948 20:53:14 -- 
rpc/skip_rpc.sh@31 -- # waitforlisten 172844 00:05:58.948 20:53:14 -- common/autotest_common.sh@817 -- # '[' -z 172844 ']' 00:05:58.948 20:53:14 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:58.948 20:53:14 -- common/autotest_common.sh@822 -- # local max_retries=100 00:05:58.948 20:53:14 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:58.948 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:58.948 20:53:14 -- common/autotest_common.sh@826 -- # xtrace_disable 00:05:58.948 20:53:14 -- common/autotest_common.sh@10 -- # set +x 00:05:58.948 20:53:14 -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:58.948 [2024-04-25 20:53:14.544810] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:05:58.948 [2024-04-25 20:53:14.544865] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid172844 ] 00:05:58.948 EAL: No free 2048 kB hugepages reported on node 1 00:05:58.948 [2024-04-25 20:53:14.580327] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:05:59.207 [2024-04-25 20:53:14.612517] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:59.207 [2024-04-25 20:53:14.649747] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:59.207 20:53:14 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:05:59.207 20:53:14 -- common/autotest_common.sh@850 -- # return 0 00:05:59.207 20:53:14 -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:05:59.207 20:53:14 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:59.207 20:53:14 -- common/autotest_common.sh@10 -- # set +x 00:05:59.207 [2024-04-25 20:53:14.834356] nvmf_rpc.c:2513:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:05:59.207 request: 00:05:59.207 { 00:05:59.207 "trtype": "tcp", 00:05:59.207 "method": "nvmf_get_transports", 00:05:59.207 "req_id": 1 00:05:59.207 } 00:05:59.207 Got JSON-RPC error response 00:05:59.207 response: 00:05:59.207 { 00:05:59.207 "code": -19, 00:05:59.207 "message": "No such device" 00:05:59.207 } 00:05:59.207 20:53:14 -- common/autotest_common.sh@577 -- # [[ 1 == 0 ]] 00:05:59.207 20:53:14 -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:05:59.207 20:53:14 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:59.207 20:53:14 -- common/autotest_common.sh@10 -- # set +x 00:05:59.207 [2024-04-25 20:53:14.842431] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:05:59.207 20:53:14 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:59.207 20:53:14 -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:05:59.207 20:53:14 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:59.207 20:53:14 -- common/autotest_common.sh@10 -- # set +x 00:05:59.466 20:53:14 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:59.466 20:53:14 -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:05:59.466 { 00:05:59.466 "subsystems": [ 00:05:59.466 { 00:05:59.466 "subsystem": "scheduler", 00:05:59.466 "config": [ 00:05:59.466 { 00:05:59.466 "method": 
"framework_set_scheduler", 00:05:59.466 "params": { 00:05:59.466 "name": "static" 00:05:59.466 } 00:05:59.466 } 00:05:59.466 ] 00:05:59.466 }, 00:05:59.466 { 00:05:59.466 "subsystem": "vmd", 00:05:59.466 "config": [] 00:05:59.466 }, 00:05:59.466 { 00:05:59.466 "subsystem": "sock", 00:05:59.466 "config": [ 00:05:59.466 { 00:05:59.466 "method": "sock_impl_set_options", 00:05:59.466 "params": { 00:05:59.466 "impl_name": "posix", 00:05:59.466 "recv_buf_size": 2097152, 00:05:59.466 "send_buf_size": 2097152, 00:05:59.466 "enable_recv_pipe": true, 00:05:59.466 "enable_quickack": false, 00:05:59.466 "enable_placement_id": 0, 00:05:59.466 "enable_zerocopy_send_server": true, 00:05:59.466 "enable_zerocopy_send_client": false, 00:05:59.466 "zerocopy_threshold": 0, 00:05:59.466 "tls_version": 0, 00:05:59.466 "enable_ktls": false 00:05:59.466 } 00:05:59.466 }, 00:05:59.466 { 00:05:59.466 "method": "sock_impl_set_options", 00:05:59.466 "params": { 00:05:59.466 "impl_name": "ssl", 00:05:59.466 "recv_buf_size": 4096, 00:05:59.466 "send_buf_size": 4096, 00:05:59.466 "enable_recv_pipe": true, 00:05:59.466 "enable_quickack": false, 00:05:59.466 "enable_placement_id": 0, 00:05:59.466 "enable_zerocopy_send_server": true, 00:05:59.466 "enable_zerocopy_send_client": false, 00:05:59.466 "zerocopy_threshold": 0, 00:05:59.466 "tls_version": 0, 00:05:59.466 "enable_ktls": false 00:05:59.466 } 00:05:59.466 } 00:05:59.466 ] 00:05:59.466 }, 00:05:59.466 { 00:05:59.466 "subsystem": "iobuf", 00:05:59.466 "config": [ 00:05:59.466 { 00:05:59.466 "method": "iobuf_set_options", 00:05:59.466 "params": { 00:05:59.466 "small_pool_count": 8192, 00:05:59.466 "large_pool_count": 1024, 00:05:59.466 "small_bufsize": 8192, 00:05:59.466 "large_bufsize": 135168 00:05:59.466 } 00:05:59.466 } 00:05:59.466 ] 00:05:59.466 }, 00:05:59.466 { 00:05:59.466 "subsystem": "keyring", 00:05:59.466 "config": [] 00:05:59.466 }, 00:05:59.466 { 00:05:59.466 "subsystem": "vfio_user_target", 00:05:59.466 "config": null 00:05:59.466 }, 00:05:59.466 { 00:05:59.466 "subsystem": "accel", 00:05:59.466 "config": [ 00:05:59.466 { 00:05:59.466 "method": "accel_set_options", 00:05:59.467 "params": { 00:05:59.467 "small_cache_size": 128, 00:05:59.467 "large_cache_size": 16, 00:05:59.467 "task_count": 2048, 00:05:59.467 "sequence_count": 2048, 00:05:59.467 "buf_count": 2048 00:05:59.467 } 00:05:59.467 } 00:05:59.467 ] 00:05:59.467 }, 00:05:59.467 { 00:05:59.467 "subsystem": "bdev", 00:05:59.467 "config": [ 00:05:59.467 { 00:05:59.467 "method": "bdev_set_options", 00:05:59.467 "params": { 00:05:59.467 "bdev_io_pool_size": 65535, 00:05:59.467 "bdev_io_cache_size": 256, 00:05:59.467 "bdev_auto_examine": true, 00:05:59.467 "iobuf_small_cache_size": 128, 00:05:59.467 "iobuf_large_cache_size": 16 00:05:59.467 } 00:05:59.467 }, 00:05:59.467 { 00:05:59.467 "method": "bdev_raid_set_options", 00:05:59.467 "params": { 00:05:59.467 "process_window_size_kb": 1024 00:05:59.467 } 00:05:59.467 }, 00:05:59.467 { 00:05:59.467 "method": "bdev_nvme_set_options", 00:05:59.467 "params": { 00:05:59.467 "action_on_timeout": "none", 00:05:59.467 "timeout_us": 0, 00:05:59.467 "timeout_admin_us": 0, 00:05:59.467 "keep_alive_timeout_ms": 10000, 00:05:59.467 "arbitration_burst": 0, 00:05:59.467 "low_priority_weight": 0, 00:05:59.467 "medium_priority_weight": 0, 00:05:59.467 "high_priority_weight": 0, 00:05:59.467 "nvme_adminq_poll_period_us": 10000, 00:05:59.467 "nvme_ioq_poll_period_us": 0, 00:05:59.467 "io_queue_requests": 0, 00:05:59.467 "delay_cmd_submit": true, 00:05:59.467 
"transport_retry_count": 4, 00:05:59.467 "bdev_retry_count": 3, 00:05:59.467 "transport_ack_timeout": 0, 00:05:59.467 "ctrlr_loss_timeout_sec": 0, 00:05:59.467 "reconnect_delay_sec": 0, 00:05:59.467 "fast_io_fail_timeout_sec": 0, 00:05:59.467 "disable_auto_failback": false, 00:05:59.467 "generate_uuids": false, 00:05:59.467 "transport_tos": 0, 00:05:59.467 "nvme_error_stat": false, 00:05:59.467 "rdma_srq_size": 0, 00:05:59.467 "io_path_stat": false, 00:05:59.467 "allow_accel_sequence": false, 00:05:59.467 "rdma_max_cq_size": 0, 00:05:59.467 "rdma_cm_event_timeout_ms": 0, 00:05:59.467 "dhchap_digests": [ 00:05:59.467 "sha256", 00:05:59.467 "sha384", 00:05:59.467 "sha512" 00:05:59.467 ], 00:05:59.467 "dhchap_dhgroups": [ 00:05:59.467 "null", 00:05:59.467 "ffdhe2048", 00:05:59.467 "ffdhe3072", 00:05:59.467 "ffdhe4096", 00:05:59.467 "ffdhe6144", 00:05:59.467 "ffdhe8192" 00:05:59.467 ] 00:05:59.467 } 00:05:59.467 }, 00:05:59.467 { 00:05:59.467 "method": "bdev_nvme_set_hotplug", 00:05:59.467 "params": { 00:05:59.467 "period_us": 100000, 00:05:59.467 "enable": false 00:05:59.467 } 00:05:59.467 }, 00:05:59.467 { 00:05:59.467 "method": "bdev_iscsi_set_options", 00:05:59.467 "params": { 00:05:59.467 "timeout_sec": 30 00:05:59.467 } 00:05:59.467 }, 00:05:59.467 { 00:05:59.467 "method": "bdev_wait_for_examine" 00:05:59.467 } 00:05:59.467 ] 00:05:59.467 }, 00:05:59.467 { 00:05:59.467 "subsystem": "nvmf", 00:05:59.467 "config": [ 00:05:59.467 { 00:05:59.467 "method": "nvmf_set_config", 00:05:59.467 "params": { 00:05:59.467 "discovery_filter": "match_any", 00:05:59.467 "admin_cmd_passthru": { 00:05:59.467 "identify_ctrlr": false 00:05:59.467 } 00:05:59.467 } 00:05:59.467 }, 00:05:59.467 { 00:05:59.467 "method": "nvmf_set_max_subsystems", 00:05:59.467 "params": { 00:05:59.467 "max_subsystems": 1024 00:05:59.467 } 00:05:59.467 }, 00:05:59.467 { 00:05:59.467 "method": "nvmf_set_crdt", 00:05:59.467 "params": { 00:05:59.467 "crdt1": 0, 00:05:59.467 "crdt2": 0, 00:05:59.467 "crdt3": 0 00:05:59.467 } 00:05:59.467 }, 00:05:59.467 { 00:05:59.467 "method": "nvmf_create_transport", 00:05:59.467 "params": { 00:05:59.467 "trtype": "TCP", 00:05:59.467 "max_queue_depth": 128, 00:05:59.467 "max_io_qpairs_per_ctrlr": 127, 00:05:59.467 "in_capsule_data_size": 4096, 00:05:59.467 "max_io_size": 131072, 00:05:59.467 "io_unit_size": 131072, 00:05:59.467 "max_aq_depth": 128, 00:05:59.467 "num_shared_buffers": 511, 00:05:59.467 "buf_cache_size": 4294967295, 00:05:59.467 "dif_insert_or_strip": false, 00:05:59.467 "zcopy": false, 00:05:59.467 "c2h_success": true, 00:05:59.467 "sock_priority": 0, 00:05:59.467 "abort_timeout_sec": 1, 00:05:59.467 "ack_timeout": 0, 00:05:59.467 "data_wr_pool_size": 0 00:05:59.467 } 00:05:59.467 } 00:05:59.467 ] 00:05:59.467 }, 00:05:59.467 { 00:05:59.467 "subsystem": "nbd", 00:05:59.467 "config": [] 00:05:59.467 }, 00:05:59.467 { 00:05:59.467 "subsystem": "ublk", 00:05:59.467 "config": [] 00:05:59.467 }, 00:05:59.467 { 00:05:59.467 "subsystem": "vhost_blk", 00:05:59.467 "config": [] 00:05:59.467 }, 00:05:59.467 { 00:05:59.467 "subsystem": "scsi", 00:05:59.467 "config": null 00:05:59.467 }, 00:05:59.467 { 00:05:59.467 "subsystem": "iscsi", 00:05:59.467 "config": [ 00:05:59.467 { 00:05:59.467 "method": "iscsi_set_options", 00:05:59.467 "params": { 00:05:59.467 "node_base": "iqn.2016-06.io.spdk", 00:05:59.467 "max_sessions": 128, 00:05:59.467 "max_connections_per_session": 2, 00:05:59.467 "max_queue_depth": 64, 00:05:59.467 "default_time2wait": 2, 00:05:59.467 "default_time2retain": 20, 00:05:59.467 
"first_burst_length": 8192, 00:05:59.467 "immediate_data": true, 00:05:59.467 "allow_duplicated_isid": false, 00:05:59.467 "error_recovery_level": 0, 00:05:59.467 "nop_timeout": 60, 00:05:59.467 "nop_in_interval": 30, 00:05:59.467 "disable_chap": false, 00:05:59.467 "require_chap": false, 00:05:59.467 "mutual_chap": false, 00:05:59.467 "chap_group": 0, 00:05:59.467 "max_large_datain_per_connection": 64, 00:05:59.467 "max_r2t_per_connection": 4, 00:05:59.467 "pdu_pool_size": 36864, 00:05:59.467 "immediate_data_pool_size": 16384, 00:05:59.467 "data_out_pool_size": 2048 00:05:59.467 } 00:05:59.467 } 00:05:59.467 ] 00:05:59.467 }, 00:05:59.467 { 00:05:59.467 "subsystem": "vhost_scsi", 00:05:59.467 "config": [] 00:05:59.467 } 00:05:59.467 ] 00:05:59.467 } 00:05:59.467 20:53:14 -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:05:59.467 20:53:14 -- rpc/skip_rpc.sh@40 -- # killprocess 172844 00:05:59.467 20:53:14 -- common/autotest_common.sh@936 -- # '[' -z 172844 ']' 00:05:59.467 20:53:14 -- common/autotest_common.sh@940 -- # kill -0 172844 00:05:59.467 20:53:15 -- common/autotest_common.sh@941 -- # uname 00:05:59.467 20:53:15 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:59.467 20:53:15 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 172844 00:05:59.467 20:53:15 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:59.467 20:53:15 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:59.467 20:53:15 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 172844' 00:05:59.467 killing process with pid 172844 00:05:59.467 20:53:15 -- common/autotest_common.sh@955 -- # kill 172844 00:05:59.467 20:53:15 -- common/autotest_common.sh@960 -- # wait 172844 00:05:59.726 20:53:15 -- rpc/skip_rpc.sh@47 -- # local spdk_pid=172983 00:05:59.726 20:53:15 -- rpc/skip_rpc.sh@48 -- # sleep 5 00:05:59.726 20:53:15 -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:06:04.997 20:53:20 -- rpc/skip_rpc.sh@50 -- # killprocess 172983 00:06:04.997 20:53:20 -- common/autotest_common.sh@936 -- # '[' -z 172983 ']' 00:06:04.997 20:53:20 -- common/autotest_common.sh@940 -- # kill -0 172983 00:06:04.997 20:53:20 -- common/autotest_common.sh@941 -- # uname 00:06:04.997 20:53:20 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:04.997 20:53:20 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 172983 00:06:04.997 20:53:20 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:04.997 20:53:20 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:04.997 20:53:20 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 172983' 00:06:04.997 killing process with pid 172983 00:06:04.997 20:53:20 -- common/autotest_common.sh@955 -- # kill 172983 00:06:04.997 20:53:20 -- common/autotest_common.sh@960 -- # wait 172983 00:06:05.257 20:53:20 -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:06:05.257 20:53:20 -- rpc/skip_rpc.sh@52 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:06:05.257 00:06:05.257 real 0m6.167s 00:06:05.257 user 0m5.849s 00:06:05.257 sys 0m0.563s 00:06:05.257 20:53:20 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:05.257 20:53:20 -- common/autotest_common.sh@10 -- # set +x 00:06:05.257 
************************************
00:06:05.257 END TEST skip_rpc_with_json
00:06:05.257 ************************************
00:06:05.257 20:53:20 -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay
00:06:05.257 20:53:20 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:06:05.257 20:53:20 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:06:05.257 20:53:20 -- common/autotest_common.sh@10 -- # set +x
00:06:05.257 ************************************
00:06:05.257 START TEST skip_rpc_with_delay
00:06:05.257 ************************************
00:06:05.257 20:53:20 -- common/autotest_common.sh@1111 -- # test_skip_rpc_with_delay
00:06:05.257 20:53:20 -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc
00:06:05.257 20:53:20 -- common/autotest_common.sh@638 -- # local es=0
00:06:05.257 20:53:20 -- common/autotest_common.sh@640 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc
00:06:05.257 20:53:20 -- common/autotest_common.sh@626 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt
00:06:05.257 20:53:20 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in
00:06:05.257 20:53:20 -- common/autotest_common.sh@630 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt
00:06:05.257 20:53:20 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in
00:06:05.257 20:53:20 -- common/autotest_common.sh@632 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt
00:06:05.257 20:53:20 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in
00:06:05.257 20:53:20 -- common/autotest_common.sh@632 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt
00:06:05.257 20:53:20 -- common/autotest_common.sh@632 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt ]]
00:06:05.257 20:53:20 -- common/autotest_common.sh@641 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc
00:06:05.257 [2024-04-25 20:53:20.908906] app.c: 751:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started.
00:06:05.257 [2024-04-25 20:53:20.909051] app.c: 630:unclaim_cpu_cores: *ERROR*: Failed to unlink lock fd for core 0, errno: 2
00:06:05.516 20:53:20 -- common/autotest_common.sh@641 -- # es=1
00:06:05.516 20:53:20 -- common/autotest_common.sh@649 -- # (( es > 128 ))
00:06:05.516 20:53:20 -- common/autotest_common.sh@660 -- # [[ -n '' ]]
00:06:05.516 20:53:20 -- common/autotest_common.sh@665 -- # (( !es == 0 ))
00:06:05.516
00:06:05.516 real 0m0.043s
00:06:05.516 user 0m0.019s
00:06:05.516 sys 0m0.025s
00:06:05.516 20:53:20 -- common/autotest_common.sh@1112 -- # xtrace_disable
00:06:05.516 20:53:20 -- common/autotest_common.sh@10 -- # set +x
00:06:05.516 ************************************
00:06:05.516 END TEST skip_rpc_with_delay
00:06:05.516 ************************************
00:06:05.516 20:53:20 -- rpc/skip_rpc.sh@77 -- # uname
00:06:05.516 20:53:20 -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']'
00:06:05.516 20:53:20 -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init
00:06:05.516 20:53:20 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:06:05.516 20:53:20 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:06:05.516 20:53:20 -- common/autotest_common.sh@10 -- # set +x
00:06:05.516 ************************************
00:06:05.516 START TEST exit_on_failed_rpc_init
00:06:05.516 ************************************
00:06:05.516 20:53:21 -- common/autotest_common.sh@1111 -- # test_exit_on_failed_rpc_init
00:06:05.516 20:53:21 -- rpc/skip_rpc.sh@62 -- # local spdk_pid=174107
00:06:05.516 20:53:21 -- rpc/skip_rpc.sh@63 -- # waitforlisten 174107
00:06:05.516 20:53:21 -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1
00:06:05.516 20:53:21 -- common/autotest_common.sh@817 -- # '[' -z 174107 ']'
00:06:05.516 20:53:21 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock
00:06:05.516 20:53:21 -- common/autotest_common.sh@822 -- # local max_retries=100
00:06:05.516 20:53:21 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:06:05.516 20:53:21 -- common/autotest_common.sh@826 -- # xtrace_disable
00:06:05.516 20:53:21 -- common/autotest_common.sh@10 -- # set +x
00:06:05.775 [2024-04-25 20:53:21.154708] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... [2024-04-25 20:53:21.154764] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid174107 ]
00:06:05.775 EAL: No free 2048 kB hugepages reported on node 1
00:06:05.775 [2024-04-25 20:53:21.190472] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation.
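
Before the exit_on_failed_rpc_init run below proceeds, it is worth restating what skip_rpc_with_json exercised earlier: runtime state created over RPC is dumped with save_config and replayed at boot with --json. A hedged sketch of that round trip, run from the SPDK repo root (only the tcp-transport step is taken from this run; everything else is illustrative):

    # start a target and mutate its state over RPC
    ./build/bin/spdk_tgt -m 0x1 &
    tgt_pid=$!
    ./scripts/rpc.py nvmf_create_transport -t tcp        # the call the test makes
    ./scripts/rpc.py save_config > /tmp/config.json      # emits the subsystems JSON shown above
    kill -SIGINT "$tgt_pid"; wait "$tgt_pid"
    # replay the snapshot at boot, with no RPC server needed
    ./build/bin/spdk_tgt -m 0x1 --json /tmp/config.json
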
00:06:05.775 [2024-04-25 20:53:21.221123] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:05.775 [2024-04-25 20:53:21.257328] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:05.775 20:53:21 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:05.775 20:53:21 -- common/autotest_common.sh@850 -- # return 0 00:06:05.775 20:53:21 -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:05.775 20:53:21 -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:05.775 20:53:21 -- common/autotest_common.sh@638 -- # local es=0 00:06:05.775 20:53:21 -- common/autotest_common.sh@640 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:05.775 20:53:21 -- common/autotest_common.sh@626 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:06.034 20:53:21 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:06.034 20:53:21 -- common/autotest_common.sh@630 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:06.035 20:53:21 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:06.035 20:53:21 -- common/autotest_common.sh@632 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:06.035 20:53:21 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:06.035 20:53:21 -- common/autotest_common.sh@632 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:06.035 20:53:21 -- common/autotest_common.sh@632 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:06:06.035 20:53:21 -- common/autotest_common.sh@641 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:06.035 [2024-04-25 20:53:21.465815] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:06:06.035 [2024-04-25 20:53:21.465896] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid174119 ] 00:06:06.035 EAL: No free 2048 kB hugepages reported on node 1 00:06:06.035 [2024-04-25 20:53:21.501524] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:06:06.035 [2024-04-25 20:53:21.533247] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:06.035 [2024-04-25 20:53:21.569594] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:06.035 [2024-04-25 20:53:21.569673] rpc.c: 181:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 
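
The errors around this point are the intended result: the test deliberately starts a second spdk_tgt against the same default socket, so its initialization fails. Two targets can coexist only when each gets its own RPC socket; a minimal sketch (the second socket path is invented here, the -r flag is the same one the json_config tests below use):

    ./build/bin/spdk_tgt -m 0x1 &                              # owns /var/tmp/spdk.sock
    ./build/bin/spdk_tgt -m 0x2 -r /var/tmp/spdk2.sock &       # second instance, separate socket
    ./scripts/rpc.py -s /var/tmp/spdk2.sock spdk_get_version   # address the second instance
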
00:06:06.035 [2024-04-25 20:53:21.569686] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:06:06.035 [2024-04-25 20:53:21.569694] app.c: 966:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:06.035 20:53:21 -- common/autotest_common.sh@641 -- # es=234 00:06:06.035 20:53:21 -- common/autotest_common.sh@649 -- # (( es > 128 )) 00:06:06.035 20:53:21 -- common/autotest_common.sh@650 -- # es=106 00:06:06.035 20:53:21 -- common/autotest_common.sh@651 -- # case "$es" in 00:06:06.035 20:53:21 -- common/autotest_common.sh@658 -- # es=1 00:06:06.035 20:53:21 -- common/autotest_common.sh@665 -- # (( !es == 0 )) 00:06:06.035 20:53:21 -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:06:06.035 20:53:21 -- rpc/skip_rpc.sh@70 -- # killprocess 174107 00:06:06.035 20:53:21 -- common/autotest_common.sh@936 -- # '[' -z 174107 ']' 00:06:06.035 20:53:21 -- common/autotest_common.sh@940 -- # kill -0 174107 00:06:06.035 20:53:21 -- common/autotest_common.sh@941 -- # uname 00:06:06.035 20:53:21 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:06.035 20:53:21 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 174107 00:06:06.035 20:53:21 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:06.035 20:53:21 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:06.035 20:53:21 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 174107' 00:06:06.035 killing process with pid 174107 00:06:06.035 20:53:21 -- common/autotest_common.sh@955 -- # kill 174107 00:06:06.035 20:53:21 -- common/autotest_common.sh@960 -- # wait 174107 00:06:06.603 00:06:06.603 real 0m0.835s 00:06:06.603 user 0m0.835s 00:06:06.603 sys 0m0.404s 00:06:06.603 20:53:21 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:06.603 20:53:21 -- common/autotest_common.sh@10 -- # set +x 00:06:06.603 ************************************ 00:06:06.603 END TEST exit_on_failed_rpc_init 00:06:06.603 ************************************ 00:06:06.603 20:53:22 -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:06:06.603 00:06:06.603 real 0m13.292s 00:06:06.603 user 0m12.104s 00:06:06.603 sys 0m1.805s 00:06:06.603 20:53:22 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:06.603 20:53:22 -- common/autotest_common.sh@10 -- # set +x 00:06:06.603 ************************************ 00:06:06.603 END TEST skip_rpc 00:06:06.603 ************************************ 00:06:06.603 20:53:22 -- spdk/autotest.sh@167 -- # run_test rpc_client /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:06.603 20:53:22 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:06.603 20:53:22 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:06.603 20:53:22 -- common/autotest_common.sh@10 -- # set +x 00:06:06.603 ************************************ 00:06:06.603 START TEST rpc_client 00:06:06.603 ************************************ 00:06:06.603 20:53:22 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:06.863 * Looking for test storage... 
00:06:06.863 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client 00:06:06.863 20:53:22 -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:06:06.863 OK 00:06:06.863 20:53:22 -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:06:06.863 00:06:06.863 real 0m0.116s 00:06:06.863 user 0m0.037s 00:06:06.863 sys 0m0.086s 00:06:06.863 20:53:22 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:06.863 20:53:22 -- common/autotest_common.sh@10 -- # set +x 00:06:06.863 ************************************ 00:06:06.863 END TEST rpc_client 00:06:06.863 ************************************ 00:06:06.863 20:53:22 -- spdk/autotest.sh@168 -- # run_test json_config /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:06:06.863 20:53:22 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:06.863 20:53:22 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:06.863 20:53:22 -- common/autotest_common.sh@10 -- # set +x 00:06:06.863 ************************************ 00:06:06.863 START TEST json_config 00:06:06.863 ************************************ 00:06:06.863 20:53:22 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:06:07.123 20:53:22 -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:06:07.123 20:53:22 -- nvmf/common.sh@7 -- # uname -s 00:06:07.123 20:53:22 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:07.123 20:53:22 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:07.123 20:53:22 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:06:07.123 20:53:22 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:07.123 20:53:22 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:06:07.123 20:53:22 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:06:07.123 20:53:22 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:07.123 20:53:22 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:06:07.123 20:53:22 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:07.123 20:53:22 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:06:07.123 20:53:22 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:06:07.123 20:53:22 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:06:07.123 20:53:22 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:07.123 20:53:22 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:06:07.123 20:53:22 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:06:07.123 20:53:22 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:06:07.123 20:53:22 -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:06:07.123 20:53:22 -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:07.123 20:53:22 -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:07.123 20:53:22 -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:07.123 20:53:22 -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:07.123 20:53:22 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:07.123 20:53:22 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:07.123 20:53:22 -- paths/export.sh@5 -- # export PATH 00:06:07.123 20:53:22 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:07.123 20:53:22 -- nvmf/common.sh@47 -- # : 0 00:06:07.123 20:53:22 -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:06:07.123 20:53:22 -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:06:07.123 20:53:22 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:06:07.123 20:53:22 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:07.123 20:53:22 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:07.124 20:53:22 -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:06:07.124 20:53:22 -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:06:07.124 20:53:22 -- nvmf/common.sh@51 -- # have_pci_nics=0 00:06:07.124 20:53:22 -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/common.sh 00:06:07.124 20:53:22 -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:06:07.124 20:53:22 -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:06:07.124 20:53:22 -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:06:07.124 20:53:22 -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:06:07.124 20:53:22 -- json_config/json_config.sh@27 -- # echo 'WARNING: No tests are enabled so not running JSON configuration tests' 00:06:07.124 WARNING: No tests are enabled so not running JSON configuration tests 00:06:07.124 20:53:22 -- json_config/json_config.sh@28 -- # exit 0 00:06:07.124 00:06:07.124 real 0m0.114s 00:06:07.124 user 0m0.052s 00:06:07.124 sys 0m0.063s 00:06:07.124 20:53:22 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:07.124 20:53:22 -- common/autotest_common.sh@10 -- # set +x 00:06:07.124 ************************************ 00:06:07.124 END TEST 
json_config 00:06:07.124 ************************************ 00:06:07.124 20:53:22 -- spdk/autotest.sh@169 -- # run_test json_config_extra_key /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:06:07.124 20:53:22 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:07.124 20:53:22 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:07.124 20:53:22 -- common/autotest_common.sh@10 -- # set +x 00:06:07.383 ************************************ 00:06:07.383 START TEST json_config_extra_key 00:06:07.383 ************************************ 00:06:07.383 20:53:22 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:06:07.383 20:53:22 -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:06:07.383 20:53:22 -- nvmf/common.sh@7 -- # uname -s 00:06:07.383 20:53:22 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:07.383 20:53:22 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:07.383 20:53:22 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:06:07.383 20:53:22 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:07.383 20:53:22 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:06:07.383 20:53:22 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:06:07.383 20:53:22 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:07.383 20:53:22 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:06:07.383 20:53:22 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:07.383 20:53:22 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:06:07.383 20:53:22 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:06:07.383 20:53:22 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:06:07.383 20:53:22 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:07.383 20:53:22 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:06:07.383 20:53:22 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:06:07.383 20:53:22 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:06:07.383 20:53:22 -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:06:07.383 20:53:22 -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:07.383 20:53:22 -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:07.383 20:53:22 -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:07.383 20:53:22 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:07.383 20:53:22 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:07.383 20:53:22 -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:07.383 20:53:22 -- paths/export.sh@5 -- # export PATH 00:06:07.383 20:53:22 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:07.383 20:53:22 -- nvmf/common.sh@47 -- # : 0 00:06:07.384 20:53:22 -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:06:07.384 20:53:22 -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:06:07.384 20:53:22 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:06:07.384 20:53:22 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:07.384 20:53:22 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:07.384 20:53:22 -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:06:07.384 20:53:22 -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:06:07.384 20:53:22 -- nvmf/common.sh@51 -- # have_pci_nics=0 00:06:07.384 20:53:22 -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/common.sh 00:06:07.384 20:53:22 -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:06:07.384 20:53:22 -- json_config/json_config_extra_key.sh@17 -- # declare -A app_pid 00:06:07.384 20:53:22 -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:06:07.384 20:53:22 -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:06:07.384 20:53:22 -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 1024') 00:06:07.384 20:53:22 -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:06:07.384 20:53:22 -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json') 00:06:07.384 20:53:22 -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:06:07.384 20:53:22 -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:06:07.384 20:53:22 -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:06:07.384 INFO: launching applications... 
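
The extra_key.json passed to the launch below is not printed in this log; a hypothetical stand-in with the same overall shape (the method name is a real SPDK RPC, but the Malloc bdev and its sizes are invented for illustration), started with the exact flags used in this run:

    cat > /tmp/extra_key.json << 'EOF'
    {
      "subsystems": [
        {
          "subsystem": "bdev",
          "config": [
            {
              "method": "bdev_malloc_create",
              "params": { "name": "Malloc0", "num_blocks": 8192, "block_size": 512 }
            }
          ]
        }
      ]
    }
    EOF
    ./build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /tmp/extra_key.json
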
00:06:07.384 20:53:22 -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:06:07.384 20:53:22 -- json_config/common.sh@9 -- # local app=target 00:06:07.384 20:53:22 -- json_config/common.sh@10 -- # shift 00:06:07.384 20:53:22 -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:06:07.384 20:53:22 -- json_config/common.sh@13 -- # [[ -z '' ]] 00:06:07.384 20:53:22 -- json_config/common.sh@15 -- # local app_extra_params= 00:06:07.384 20:53:22 -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:07.384 20:53:22 -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:07.384 20:53:22 -- json_config/common.sh@22 -- # app_pid["$app"]=174546 00:06:07.384 20:53:22 -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:06:07.384 Waiting for target to run... 00:06:07.384 20:53:22 -- json_config/common.sh@25 -- # waitforlisten 174546 /var/tmp/spdk_tgt.sock 00:06:07.384 20:53:22 -- common/autotest_common.sh@817 -- # '[' -z 174546 ']' 00:06:07.384 20:53:22 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:07.384 20:53:22 -- json_config/common.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:06:07.384 20:53:22 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:07.384 20:53:22 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:06:07.384 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:06:07.384 20:53:22 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:07.384 20:53:22 -- common/autotest_common.sh@10 -- # set +x 00:06:07.384 [2024-04-25 20:53:22.970763] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:06:07.384 [2024-04-25 20:53:22.970841] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid174546 ] 00:06:07.384 EAL: No free 2048 kB hugepages reported on node 1 00:06:07.950 [2024-04-25 20:53:23.377907] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:06:07.950 [2024-04-25 20:53:23.409701] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:07.950 [2024-04-25 20:53:23.437959] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:08.209 20:53:23 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:08.209 20:53:23 -- common/autotest_common.sh@850 -- # return 0 00:06:08.209 20:53:23 -- json_config/common.sh@26 -- # echo '' 00:06:08.209 00:06:08.209 20:53:23 -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:06:08.209 INFO: shutting down applications... 
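
The shutdown that follows is a polite SIGINT plus a bounded liveness poll, condensed here from the json_config/common.sh steps visible in this trace:

    kill -SIGINT "$app_pid"                         # ask the target to exit cleanly
    for (( i = 0; i < 30; i++ )); do
        kill -0 "$app_pid" 2> /dev/null || break    # stop polling once the pid is gone
        sleep 0.5
    done
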
00:06:08.209 20:53:23 -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:06:08.209 20:53:23 -- json_config/common.sh@31 -- # local app=target 00:06:08.209 20:53:23 -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:06:08.209 20:53:23 -- json_config/common.sh@35 -- # [[ -n 174546 ]] 00:06:08.209 20:53:23 -- json_config/common.sh@38 -- # kill -SIGINT 174546 00:06:08.209 20:53:23 -- json_config/common.sh@40 -- # (( i = 0 )) 00:06:08.209 20:53:23 -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:08.209 20:53:23 -- json_config/common.sh@41 -- # kill -0 174546 00:06:08.209 20:53:23 -- json_config/common.sh@45 -- # sleep 0.5 00:06:08.778 20:53:24 -- json_config/common.sh@40 -- # (( i++ )) 00:06:08.778 20:53:24 -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:08.778 20:53:24 -- json_config/common.sh@41 -- # kill -0 174546 00:06:08.778 20:53:24 -- json_config/common.sh@42 -- # app_pid["$app"]= 00:06:08.778 20:53:24 -- json_config/common.sh@43 -- # break 00:06:08.778 20:53:24 -- json_config/common.sh@48 -- # [[ -n '' ]] 00:06:08.778 20:53:24 -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:06:08.778 SPDK target shutdown done 00:06:08.778 20:53:24 -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:06:08.778 Success 00:06:08.778 00:06:08.778 real 0m1.445s 00:06:08.778 user 0m1.006s 00:06:08.778 sys 0m0.553s 00:06:08.778 20:53:24 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:08.778 20:53:24 -- common/autotest_common.sh@10 -- # set +x 00:06:08.778 ************************************ 00:06:08.778 END TEST json_config_extra_key 00:06:08.778 ************************************ 00:06:08.778 20:53:24 -- spdk/autotest.sh@170 -- # run_test alias_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:06:08.778 20:53:24 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:08.778 20:53:24 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:08.778 20:53:24 -- common/autotest_common.sh@10 -- # set +x 00:06:09.037 ************************************ 00:06:09.037 START TEST alias_rpc 00:06:09.037 ************************************ 00:06:09.037 20:53:24 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:06:09.037 * Looking for test storage... 00:06:09.037 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc 00:06:09.037 20:53:24 -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:09.037 20:53:24 -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=174873 00:06:09.037 20:53:24 -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 174873 00:06:09.037 20:53:24 -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:09.037 20:53:24 -- common/autotest_common.sh@817 -- # '[' -z 174873 ']' 00:06:09.037 20:53:24 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:09.037 20:53:24 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:09.037 20:53:24 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:09.037 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
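
waitforlisten, used by every test in this run, boils down to polling the RPC socket until the target answers. The real helper in test/common/autotest_common.sh does more bookkeeping; the gist is roughly this sketch:

    waitforlisten_sketch() {
        local pid=$1 sock=${2:-/var/tmp/spdk.sock}
        while ! ./scripts/rpc.py -s "$sock" -t 1 rpc_get_methods > /dev/null 2>&1; do
            kill -0 "$pid" || return 1    # give up if the target died before listening
            sleep 0.2
        done
    }
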
00:06:09.037 20:53:24 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:09.037 20:53:24 -- common/autotest_common.sh@10 -- # set +x 00:06:09.037 [2024-04-25 20:53:24.604047] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:06:09.037 [2024-04-25 20:53:24.604110] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid174873 ] 00:06:09.037 EAL: No free 2048 kB hugepages reported on node 1 00:06:09.037 [2024-04-25 20:53:24.641556] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:06:09.037 [2024-04-25 20:53:24.673107] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:09.298 [2024-04-25 20:53:24.711359] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:09.298 20:53:24 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:09.298 20:53:24 -- common/autotest_common.sh@850 -- # return 0 00:06:09.298 20:53:24 -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py load_config -i 00:06:09.630 20:53:25 -- alias_rpc/alias_rpc.sh@19 -- # killprocess 174873 00:06:09.630 20:53:25 -- common/autotest_common.sh@936 -- # '[' -z 174873 ']' 00:06:09.630 20:53:25 -- common/autotest_common.sh@940 -- # kill -0 174873 00:06:09.630 20:53:25 -- common/autotest_common.sh@941 -- # uname 00:06:09.630 20:53:25 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:09.630 20:53:25 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 174873 00:06:09.630 20:53:25 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:09.630 20:53:25 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:09.630 20:53:25 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 174873' 00:06:09.630 killing process with pid 174873 00:06:09.630 20:53:25 -- common/autotest_common.sh@955 -- # kill 174873 00:06:09.630 20:53:25 -- common/autotest_common.sh@960 -- # wait 174873 00:06:09.914 00:06:09.914 real 0m0.973s 00:06:09.914 user 0m0.951s 00:06:09.914 sys 0m0.410s 00:06:09.914 20:53:25 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:09.914 20:53:25 -- common/autotest_common.sh@10 -- # set +x 00:06:09.914 ************************************ 00:06:09.914 END TEST alias_rpc 00:06:09.914 ************************************ 00:06:09.914 20:53:25 -- spdk/autotest.sh@172 -- # [[ 0 -eq 0 ]] 00:06:09.914 20:53:25 -- spdk/autotest.sh@173 -- # run_test spdkcli_tcp /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:06:09.914 20:53:25 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:09.914 20:53:25 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:09.914 20:53:25 -- common/autotest_common.sh@10 -- # set +x 00:06:10.174 ************************************ 00:06:10.174 START TEST spdkcli_tcp 00:06:10.174 ************************************ 00:06:10.174 20:53:25 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:06:10.174 * Looking for test storage... 
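
alias_rpc above drives rpc.py load_config -i against the running target; the spdkcli_tcp test starting below never opens a TCP port in the target itself. Instead a socat process bridges TCP 127.0.0.1:9998 to the default Unix socket, and rpc.py talks to the TCP side. Reconstructed from the invocations in this trace:

    socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock &             # one-shot TCP-to-Unix bridge
    ./scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods   # -r retries, -t timeout in seconds
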
00:06:10.174 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli 00:06:10.174 20:53:25 -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/common.sh 00:06:10.174 20:53:25 -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:06:10.174 20:53:25 -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/clear_config.py 00:06:10.174 20:53:25 -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:06:10.174 20:53:25 -- spdkcli/tcp.sh@19 -- # PORT=9998 00:06:10.174 20:53:25 -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:06:10.174 20:53:25 -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:06:10.174 20:53:25 -- common/autotest_common.sh@710 -- # xtrace_disable 00:06:10.174 20:53:25 -- common/autotest_common.sh@10 -- # set +x 00:06:10.174 20:53:25 -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=175200 00:06:10.174 20:53:25 -- spdkcli/tcp.sh@27 -- # waitforlisten 175200 00:06:10.174 20:53:25 -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:06:10.174 20:53:25 -- common/autotest_common.sh@817 -- # '[' -z 175200 ']' 00:06:10.174 20:53:25 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:10.174 20:53:25 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:10.174 20:53:25 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:10.174 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:10.174 20:53:25 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:10.174 20:53:25 -- common/autotest_common.sh@10 -- # set +x 00:06:10.174 [2024-04-25 20:53:25.769133] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:06:10.174 [2024-04-25 20:53:25.769194] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid175200 ] 00:06:10.174 EAL: No free 2048 kB hugepages reported on node 1 00:06:10.174 [2024-04-25 20:53:25.805181] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 
00:06:10.433 [2024-04-25 20:53:25.837540] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:10.433 [2024-04-25 20:53:25.874869] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:10.433 [2024-04-25 20:53:25.874870] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:10.433 20:53:26 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:10.433 20:53:26 -- common/autotest_common.sh@850 -- # return 0 00:06:10.433 20:53:26 -- spdkcli/tcp.sh@31 -- # socat_pid=175209 00:06:10.433 20:53:26 -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:06:10.433 20:53:26 -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:06:10.692 [ 00:06:10.692 "spdk_get_version", 00:06:10.692 "rpc_get_methods", 00:06:10.692 "trace_get_info", 00:06:10.692 "trace_get_tpoint_group_mask", 00:06:10.692 "trace_disable_tpoint_group", 00:06:10.692 "trace_enable_tpoint_group", 00:06:10.692 "trace_clear_tpoint_mask", 00:06:10.692 "trace_set_tpoint_mask", 00:06:10.692 "vfu_tgt_set_base_path", 00:06:10.692 "framework_get_pci_devices", 00:06:10.692 "framework_get_config", 00:06:10.692 "framework_get_subsystems", 00:06:10.692 "keyring_get_keys", 00:06:10.692 "iobuf_get_stats", 00:06:10.692 "iobuf_set_options", 00:06:10.692 "sock_get_default_impl", 00:06:10.692 "sock_set_default_impl", 00:06:10.692 "sock_impl_set_options", 00:06:10.692 "sock_impl_get_options", 00:06:10.692 "vmd_rescan", 00:06:10.692 "vmd_remove_device", 00:06:10.692 "vmd_enable", 00:06:10.692 "accel_get_stats", 00:06:10.692 "accel_set_options", 00:06:10.692 "accel_set_driver", 00:06:10.692 "accel_crypto_key_destroy", 00:06:10.692 "accel_crypto_keys_get", 00:06:10.692 "accel_crypto_key_create", 00:06:10.692 "accel_assign_opc", 00:06:10.692 "accel_get_module_info", 00:06:10.692 "accel_get_opc_assignments", 00:06:10.692 "notify_get_notifications", 00:06:10.692 "notify_get_types", 00:06:10.692 "bdev_get_histogram", 00:06:10.693 "bdev_enable_histogram", 00:06:10.693 "bdev_set_qos_limit", 00:06:10.693 "bdev_set_qd_sampling_period", 00:06:10.693 "bdev_get_bdevs", 00:06:10.693 "bdev_reset_iostat", 00:06:10.693 "bdev_get_iostat", 00:06:10.693 "bdev_examine", 00:06:10.693 "bdev_wait_for_examine", 00:06:10.693 "bdev_set_options", 00:06:10.693 "scsi_get_devices", 00:06:10.693 "thread_set_cpumask", 00:06:10.693 "framework_get_scheduler", 00:06:10.693 "framework_set_scheduler", 00:06:10.693 "framework_get_reactors", 00:06:10.693 "thread_get_io_channels", 00:06:10.693 "thread_get_pollers", 00:06:10.693 "thread_get_stats", 00:06:10.693 "framework_monitor_context_switch", 00:06:10.693 "spdk_kill_instance", 00:06:10.693 "log_enable_timestamps", 00:06:10.693 "log_get_flags", 00:06:10.693 "log_clear_flag", 00:06:10.693 "log_set_flag", 00:06:10.693 "log_get_level", 00:06:10.693 "log_set_level", 00:06:10.693 "log_get_print_level", 00:06:10.693 "log_set_print_level", 00:06:10.693 "framework_enable_cpumask_locks", 00:06:10.693 "framework_disable_cpumask_locks", 00:06:10.693 "framework_wait_init", 00:06:10.693 "framework_start_init", 00:06:10.693 "virtio_blk_create_transport", 00:06:10.693 "virtio_blk_get_transports", 00:06:10.693 "vhost_controller_set_coalescing", 00:06:10.693 "vhost_get_controllers", 00:06:10.693 "vhost_delete_controller", 00:06:10.693 "vhost_create_blk_controller", 00:06:10.693 "vhost_scsi_controller_remove_target", 00:06:10.693 "vhost_scsi_controller_add_target", 00:06:10.693 
"vhost_start_scsi_controller", 00:06:10.693 "vhost_create_scsi_controller", 00:06:10.693 "ublk_recover_disk", 00:06:10.693 "ublk_get_disks", 00:06:10.693 "ublk_stop_disk", 00:06:10.693 "ublk_start_disk", 00:06:10.693 "ublk_destroy_target", 00:06:10.693 "ublk_create_target", 00:06:10.693 "nbd_get_disks", 00:06:10.693 "nbd_stop_disk", 00:06:10.693 "nbd_start_disk", 00:06:10.693 "env_dpdk_get_mem_stats", 00:06:10.693 "nvmf_subsystem_get_listeners", 00:06:10.693 "nvmf_subsystem_get_qpairs", 00:06:10.693 "nvmf_subsystem_get_controllers", 00:06:10.693 "nvmf_get_stats", 00:06:10.693 "nvmf_get_transports", 00:06:10.693 "nvmf_create_transport", 00:06:10.693 "nvmf_get_targets", 00:06:10.693 "nvmf_delete_target", 00:06:10.693 "nvmf_create_target", 00:06:10.693 "nvmf_subsystem_allow_any_host", 00:06:10.693 "nvmf_subsystem_remove_host", 00:06:10.693 "nvmf_subsystem_add_host", 00:06:10.693 "nvmf_ns_remove_host", 00:06:10.693 "nvmf_ns_add_host", 00:06:10.693 "nvmf_subsystem_remove_ns", 00:06:10.693 "nvmf_subsystem_add_ns", 00:06:10.693 "nvmf_subsystem_listener_set_ana_state", 00:06:10.693 "nvmf_discovery_get_referrals", 00:06:10.693 "nvmf_discovery_remove_referral", 00:06:10.693 "nvmf_discovery_add_referral", 00:06:10.693 "nvmf_subsystem_remove_listener", 00:06:10.693 "nvmf_subsystem_add_listener", 00:06:10.693 "nvmf_delete_subsystem", 00:06:10.693 "nvmf_create_subsystem", 00:06:10.693 "nvmf_get_subsystems", 00:06:10.693 "nvmf_set_crdt", 00:06:10.693 "nvmf_set_config", 00:06:10.693 "nvmf_set_max_subsystems", 00:06:10.693 "iscsi_get_histogram", 00:06:10.693 "iscsi_enable_histogram", 00:06:10.693 "iscsi_set_options", 00:06:10.693 "iscsi_get_auth_groups", 00:06:10.693 "iscsi_auth_group_remove_secret", 00:06:10.693 "iscsi_auth_group_add_secret", 00:06:10.693 "iscsi_delete_auth_group", 00:06:10.693 "iscsi_create_auth_group", 00:06:10.693 "iscsi_set_discovery_auth", 00:06:10.693 "iscsi_get_options", 00:06:10.693 "iscsi_target_node_request_logout", 00:06:10.693 "iscsi_target_node_set_redirect", 00:06:10.693 "iscsi_target_node_set_auth", 00:06:10.693 "iscsi_target_node_add_lun", 00:06:10.693 "iscsi_get_stats", 00:06:10.693 "iscsi_get_connections", 00:06:10.693 "iscsi_portal_group_set_auth", 00:06:10.693 "iscsi_start_portal_group", 00:06:10.693 "iscsi_delete_portal_group", 00:06:10.693 "iscsi_create_portal_group", 00:06:10.693 "iscsi_get_portal_groups", 00:06:10.693 "iscsi_delete_target_node", 00:06:10.693 "iscsi_target_node_remove_pg_ig_maps", 00:06:10.693 "iscsi_target_node_add_pg_ig_maps", 00:06:10.693 "iscsi_create_target_node", 00:06:10.693 "iscsi_get_target_nodes", 00:06:10.693 "iscsi_delete_initiator_group", 00:06:10.693 "iscsi_initiator_group_remove_initiators", 00:06:10.693 "iscsi_initiator_group_add_initiators", 00:06:10.693 "iscsi_create_initiator_group", 00:06:10.693 "iscsi_get_initiator_groups", 00:06:10.693 "keyring_file_remove_key", 00:06:10.693 "keyring_file_add_key", 00:06:10.693 "vfu_virtio_create_scsi_endpoint", 00:06:10.693 "vfu_virtio_scsi_remove_target", 00:06:10.693 "vfu_virtio_scsi_add_target", 00:06:10.693 "vfu_virtio_create_blk_endpoint", 00:06:10.693 "vfu_virtio_delete_endpoint", 00:06:10.693 "iaa_scan_accel_module", 00:06:10.693 "dsa_scan_accel_module", 00:06:10.693 "ioat_scan_accel_module", 00:06:10.693 "accel_error_inject_error", 00:06:10.693 "bdev_iscsi_delete", 00:06:10.693 "bdev_iscsi_create", 00:06:10.693 "bdev_iscsi_set_options", 00:06:10.693 "bdev_virtio_attach_controller", 00:06:10.693 "bdev_virtio_scsi_get_devices", 00:06:10.693 "bdev_virtio_detach_controller", 00:06:10.693 
"bdev_virtio_blk_set_hotplug", 00:06:10.693 "bdev_ftl_set_property", 00:06:10.693 "bdev_ftl_get_properties", 00:06:10.693 "bdev_ftl_get_stats", 00:06:10.693 "bdev_ftl_unmap", 00:06:10.693 "bdev_ftl_unload", 00:06:10.693 "bdev_ftl_delete", 00:06:10.693 "bdev_ftl_load", 00:06:10.693 "bdev_ftl_create", 00:06:10.693 "bdev_aio_delete", 00:06:10.693 "bdev_aio_rescan", 00:06:10.693 "bdev_aio_create", 00:06:10.693 "blobfs_create", 00:06:10.693 "blobfs_detect", 00:06:10.693 "blobfs_set_cache_size", 00:06:10.693 "bdev_zone_block_delete", 00:06:10.693 "bdev_zone_block_create", 00:06:10.693 "bdev_delay_delete", 00:06:10.693 "bdev_delay_create", 00:06:10.693 "bdev_delay_update_latency", 00:06:10.693 "bdev_split_delete", 00:06:10.693 "bdev_split_create", 00:06:10.693 "bdev_error_inject_error", 00:06:10.693 "bdev_error_delete", 00:06:10.693 "bdev_error_create", 00:06:10.693 "bdev_raid_set_options", 00:06:10.693 "bdev_raid_remove_base_bdev", 00:06:10.693 "bdev_raid_add_base_bdev", 00:06:10.693 "bdev_raid_delete", 00:06:10.693 "bdev_raid_create", 00:06:10.693 "bdev_raid_get_bdevs", 00:06:10.693 "bdev_lvol_grow_lvstore", 00:06:10.693 "bdev_lvol_get_lvols", 00:06:10.693 "bdev_lvol_get_lvstores", 00:06:10.693 "bdev_lvol_delete", 00:06:10.693 "bdev_lvol_set_read_only", 00:06:10.693 "bdev_lvol_resize", 00:06:10.693 "bdev_lvol_decouple_parent", 00:06:10.693 "bdev_lvol_inflate", 00:06:10.693 "bdev_lvol_rename", 00:06:10.693 "bdev_lvol_clone_bdev", 00:06:10.693 "bdev_lvol_clone", 00:06:10.693 "bdev_lvol_snapshot", 00:06:10.693 "bdev_lvol_create", 00:06:10.693 "bdev_lvol_delete_lvstore", 00:06:10.693 "bdev_lvol_rename_lvstore", 00:06:10.693 "bdev_lvol_create_lvstore", 00:06:10.693 "bdev_passthru_delete", 00:06:10.693 "bdev_passthru_create", 00:06:10.693 "bdev_nvme_cuse_unregister", 00:06:10.693 "bdev_nvme_cuse_register", 00:06:10.693 "bdev_opal_new_user", 00:06:10.693 "bdev_opal_set_lock_state", 00:06:10.693 "bdev_opal_delete", 00:06:10.693 "bdev_opal_get_info", 00:06:10.693 "bdev_opal_create", 00:06:10.693 "bdev_nvme_opal_revert", 00:06:10.693 "bdev_nvme_opal_init", 00:06:10.693 "bdev_nvme_send_cmd", 00:06:10.693 "bdev_nvme_get_path_iostat", 00:06:10.693 "bdev_nvme_get_mdns_discovery_info", 00:06:10.693 "bdev_nvme_stop_mdns_discovery", 00:06:10.693 "bdev_nvme_start_mdns_discovery", 00:06:10.693 "bdev_nvme_set_multipath_policy", 00:06:10.693 "bdev_nvme_set_preferred_path", 00:06:10.693 "bdev_nvme_get_io_paths", 00:06:10.693 "bdev_nvme_remove_error_injection", 00:06:10.693 "bdev_nvme_add_error_injection", 00:06:10.693 "bdev_nvme_get_discovery_info", 00:06:10.693 "bdev_nvme_stop_discovery", 00:06:10.693 "bdev_nvme_start_discovery", 00:06:10.693 "bdev_nvme_get_controller_health_info", 00:06:10.693 "bdev_nvme_disable_controller", 00:06:10.693 "bdev_nvme_enable_controller", 00:06:10.693 "bdev_nvme_reset_controller", 00:06:10.693 "bdev_nvme_get_transport_statistics", 00:06:10.693 "bdev_nvme_apply_firmware", 00:06:10.693 "bdev_nvme_detach_controller", 00:06:10.693 "bdev_nvme_get_controllers", 00:06:10.693 "bdev_nvme_attach_controller", 00:06:10.693 "bdev_nvme_set_hotplug", 00:06:10.693 "bdev_nvme_set_options", 00:06:10.693 "bdev_null_resize", 00:06:10.693 "bdev_null_delete", 00:06:10.693 "bdev_null_create", 00:06:10.693 "bdev_malloc_delete", 00:06:10.693 "bdev_malloc_create" 00:06:10.693 ] 00:06:10.693 20:53:26 -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:06:10.693 20:53:26 -- common/autotest_common.sh@716 -- # xtrace_disable 00:06:10.693 20:53:26 -- common/autotest_common.sh@10 -- # set +x 00:06:10.693 
20:53:26 -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:06:10.693 20:53:26 -- spdkcli/tcp.sh@38 -- # killprocess 175200 00:06:10.693 20:53:26 -- common/autotest_common.sh@936 -- # '[' -z 175200 ']' 00:06:10.693 20:53:26 -- common/autotest_common.sh@940 -- # kill -0 175200 00:06:10.693 20:53:26 -- common/autotest_common.sh@941 -- # uname 00:06:10.693 20:53:26 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:10.693 20:53:26 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 175200 00:06:10.693 20:53:26 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:10.693 20:53:26 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:10.693 20:53:26 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 175200' 00:06:10.693 killing process with pid 175200 00:06:10.693 20:53:26 -- common/autotest_common.sh@955 -- # kill 175200 00:06:10.693 20:53:26 -- common/autotest_common.sh@960 -- # wait 175200 00:06:10.952 00:06:10.952 real 0m0.972s 00:06:10.952 user 0m1.617s 00:06:10.952 sys 0m0.437s 00:06:10.952 20:53:26 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:10.952 20:53:26 -- common/autotest_common.sh@10 -- # set +x 00:06:10.952 ************************************ 00:06:10.952 END TEST spdkcli_tcp 00:06:10.952 ************************************ 00:06:11.210 20:53:26 -- spdk/autotest.sh@176 -- # run_test dpdk_mem_utility /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:06:11.210 20:53:26 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:11.210 20:53:26 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:11.210 20:53:26 -- common/autotest_common.sh@10 -- # set +x 00:06:11.210 ************************************ 00:06:11.210 START TEST dpdk_mem_utility 00:06:11.210 ************************************ 00:06:11.210 20:53:26 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:06:11.210 * Looking for test storage... 00:06:11.210 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility 00:06:11.210 20:53:26 -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:06:11.210 20:53:26 -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=175454 00:06:11.210 20:53:26 -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 175454 00:06:11.210 20:53:26 -- common/autotest_common.sh@817 -- # '[' -z 175454 ']' 00:06:11.210 20:53:26 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:11.210 20:53:26 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:11.210 20:53:26 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:11.210 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:11.210 20:53:26 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:11.210 20:53:26 -- common/autotest_common.sh@10 -- # set +x 00:06:11.210 20:53:26 -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:11.468 [2024-04-25 20:53:26.888347] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 
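The killprocess trace above boils down to a liveness probe plus a guarded kill. A rough sketch of that shape, reconstructed from the xtrace (assumed helper internals, not its verbatim source):

pid=175200
if [ -n "$pid" ] && kill -0 "$pid" 2>/dev/null; then
  name=$(ps --no-headers -o comm= "$pid")       # an SPDK target shows up as reactor_0
  [ "$name" != sudo ] && echo "killing process with pid $pid" && kill "$pid"
  wait "$pid"                                   # valid here because spdk_tgt is a shell child
fi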
00:06:11.468 [2024-04-25 20:53:26.888430] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid175454 ] 00:06:11.468 EAL: No free 2048 kB hugepages reported on node 1 00:06:11.468 [2024-04-25 20:53:26.925485] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:06:11.468 [2024-04-25 20:53:26.957324] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:11.468 [2024-04-25 20:53:26.995396] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:11.727 20:53:27 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:11.727 20:53:27 -- common/autotest_common.sh@850 -- # return 0 00:06:11.727 20:53:27 -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:06:11.727 20:53:27 -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:06:11.727 20:53:27 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:11.727 20:53:27 -- common/autotest_common.sh@10 -- # set +x 00:06:11.727 { 00:06:11.727 "filename": "/tmp/spdk_mem_dump.txt" 00:06:11.727 } 00:06:11.727 20:53:27 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:11.727 20:53:27 -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:06:11.727 DPDK memory size 814.000000 MiB in 1 heap(s) 00:06:11.727 1 heaps totaling size 814.000000 MiB 00:06:11.727 size: 814.000000 MiB heap id: 0 00:06:11.727 end heaps---------- 00:06:11.727 8 mempools totaling size 598.116089 MiB 00:06:11.727 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:06:11.727 size: 158.602051 MiB name: PDU_data_out_Pool 00:06:11.727 size: 84.521057 MiB name: bdev_io_175454 00:06:11.727 size: 51.011292 MiB name: evtpool_175454 00:06:11.727 size: 50.003479 MiB name: msgpool_175454 00:06:11.727 size: 21.763794 MiB name: PDU_Pool 00:06:11.727 size: 19.513306 MiB name: SCSI_TASK_Pool 00:06:11.727 size: 0.026123 MiB name: Session_Pool 00:06:11.727 end mempools------- 00:06:11.727 6 memzones totaling size 4.142822 MiB 00:06:11.727 size: 1.000366 MiB name: RG_ring_0_175454 00:06:11.727 size: 1.000366 MiB name: RG_ring_1_175454 00:06:11.727 size: 1.000366 MiB name: RG_ring_4_175454 00:06:11.727 size: 1.000366 MiB name: RG_ring_5_175454 00:06:11.727 size: 0.125366 MiB name: RG_ring_2_175454 00:06:11.727 size: 0.015991 MiB name: RG_ring_3_175454 00:06:11.727 end memzones------- 00:06:11.727 20:53:27 -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:06:11.727 heap id: 0 total size: 814.000000 MiB number of busy elements: 41 number of free elements: 15 00:06:11.727 list of free elements. 
size: 12.519348 MiB 00:06:11.727 element at address: 0x200000400000 with size: 1.999512 MiB 00:06:11.727 element at address: 0x200018e00000 with size: 0.999878 MiB 00:06:11.727 element at address: 0x200019000000 with size: 0.999878 MiB 00:06:11.727 element at address: 0x200003e00000 with size: 0.996277 MiB 00:06:11.727 element at address: 0x200031c00000 with size: 0.994446 MiB 00:06:11.727 element at address: 0x200013800000 with size: 0.978699 MiB 00:06:11.727 element at address: 0x200007000000 with size: 0.959839 MiB 00:06:11.727 element at address: 0x200019200000 with size: 0.936584 MiB 00:06:11.727 element at address: 0x200000200000 with size: 0.841614 MiB 00:06:11.727 element at address: 0x20001aa00000 with size: 0.582886 MiB 00:06:11.727 element at address: 0x20000b200000 with size: 0.490723 MiB 00:06:11.727 element at address: 0x200000800000 with size: 0.487793 MiB 00:06:11.727 element at address: 0x200019400000 with size: 0.485657 MiB 00:06:11.727 element at address: 0x200027e00000 with size: 0.410034 MiB 00:06:11.727 element at address: 0x200003a00000 with size: 0.355530 MiB 00:06:11.727 list of standard malloc elements. size: 199.218079 MiB 00:06:11.727 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:06:11.727 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:06:11.727 element at address: 0x200018efff80 with size: 1.000122 MiB 00:06:11.727 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:06:11.727 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:06:11.727 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:06:11.727 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:06:11.727 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:06:11.727 element at address: 0x2000192efdc0 with size: 0.000305 MiB 00:06:11.727 element at address: 0x2000002d7740 with size: 0.000183 MiB 00:06:11.727 element at address: 0x2000002d7800 with size: 0.000183 MiB 00:06:11.727 element at address: 0x2000002d78c0 with size: 0.000183 MiB 00:06:11.727 element at address: 0x2000002d7ac0 with size: 0.000183 MiB 00:06:11.727 element at address: 0x2000002d7b80 with size: 0.000183 MiB 00:06:11.727 element at address: 0x2000002d7c40 with size: 0.000183 MiB 00:06:11.727 element at address: 0x2000003d9e40 with size: 0.000183 MiB 00:06:11.727 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:06:11.727 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:06:11.727 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:06:11.727 element at address: 0x200003a5b040 with size: 0.000183 MiB 00:06:11.727 element at address: 0x200003adb300 with size: 0.000183 MiB 00:06:11.727 element at address: 0x200003adb500 with size: 0.000183 MiB 00:06:11.727 element at address: 0x200003adf7c0 with size: 0.000183 MiB 00:06:11.727 element at address: 0x200003affa80 with size: 0.000183 MiB 00:06:11.727 element at address: 0x200003affb40 with size: 0.000183 MiB 00:06:11.727 element at address: 0x200003eff0c0 with size: 0.000183 MiB 00:06:11.727 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:06:11.727 element at address: 0x20000b27da00 with size: 0.000183 MiB 00:06:11.727 element at address: 0x20000b27dac0 with size: 0.000183 MiB 00:06:11.727 element at address: 0x20000b2fdd80 with size: 0.000183 MiB 00:06:11.727 element at address: 0x2000138fa8c0 with size: 0.000183 MiB 00:06:11.727 element at address: 0x2000192efc40 with size: 0.000183 MiB 00:06:11.727 element at address: 0x2000192efd00 with size: 0.000183 MiB 
00:06:11.727 element at address: 0x2000194bc740 with size: 0.000183 MiB 00:06:11.727 element at address: 0x20001aa95380 with size: 0.000183 MiB 00:06:11.727 element at address: 0x20001aa95440 with size: 0.000183 MiB 00:06:11.727 element at address: 0x200027e68f80 with size: 0.000183 MiB 00:06:11.727 element at address: 0x200027e69040 with size: 0.000183 MiB 00:06:11.727 element at address: 0x200027e6fc40 with size: 0.000183 MiB 00:06:11.727 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:06:11.727 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:06:11.727 list of memzone associated elements. size: 602.262573 MiB 00:06:11.727 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:06:11.727 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:06:11.727 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:06:11.727 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:06:11.727 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:06:11.727 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_175454_0 00:06:11.727 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:06:11.727 associated memzone info: size: 48.002930 MiB name: MP_evtpool_175454_0 00:06:11.727 element at address: 0x200003fff380 with size: 48.003052 MiB 00:06:11.727 associated memzone info: size: 48.002930 MiB name: MP_msgpool_175454_0 00:06:11.727 element at address: 0x2000195be940 with size: 20.255554 MiB 00:06:11.727 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:06:11.727 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:06:11.728 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:06:11.728 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:06:11.728 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_175454 00:06:11.728 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:06:11.728 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_175454 00:06:11.728 element at address: 0x2000002d7d00 with size: 1.008118 MiB 00:06:11.728 associated memzone info: size: 1.007996 MiB name: MP_evtpool_175454 00:06:11.728 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:06:11.728 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:06:11.728 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:06:11.728 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:06:11.728 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:06:11.728 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:06:11.728 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:06:11.728 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:06:11.728 element at address: 0x200003eff180 with size: 1.000488 MiB 00:06:11.728 associated memzone info: size: 1.000366 MiB name: RG_ring_0_175454 00:06:11.728 element at address: 0x200003affc00 with size: 1.000488 MiB 00:06:11.728 associated memzone info: size: 1.000366 MiB name: RG_ring_1_175454 00:06:11.728 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:06:11.728 associated memzone info: size: 1.000366 MiB name: RG_ring_4_175454 00:06:11.728 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:06:11.728 associated memzone info: size: 1.000366 MiB name: RG_ring_5_175454 00:06:11.728 element at address: 0x200003a5b100 with size: 0.500488 MiB 00:06:11.728 associated memzone 
info: size: 0.500366 MiB name: RG_MP_bdev_io_175454 00:06:11.728 element at address: 0x20000b27db80 with size: 0.500488 MiB 00:06:11.728 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:06:11.728 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:06:11.728 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:06:11.728 element at address: 0x20001947c540 with size: 0.250488 MiB 00:06:11.728 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:06:11.728 element at address: 0x200003adf880 with size: 0.125488 MiB 00:06:11.728 associated memzone info: size: 0.125366 MiB name: RG_ring_2_175454 00:06:11.728 element at address: 0x2000070f5b80 with size: 0.031738 MiB 00:06:11.728 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:06:11.728 element at address: 0x200027e69100 with size: 0.023743 MiB 00:06:11.728 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:06:11.728 element at address: 0x200003adb5c0 with size: 0.016113 MiB 00:06:11.728 associated memzone info: size: 0.015991 MiB name: RG_ring_3_175454 00:06:11.728 element at address: 0x200027e6f240 with size: 0.002441 MiB 00:06:11.728 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:06:11.728 element at address: 0x2000002d7980 with size: 0.000305 MiB 00:06:11.728 associated memzone info: size: 0.000183 MiB name: MP_msgpool_175454 00:06:11.728 element at address: 0x200003adb3c0 with size: 0.000305 MiB 00:06:11.728 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_175454 00:06:11.728 element at address: 0x200027e6fd00 with size: 0.000305 MiB 00:06:11.728 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:06:11.728 20:53:27 -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:06:11.728 20:53:27 -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 175454 00:06:11.728 20:53:27 -- common/autotest_common.sh@936 -- # '[' -z 175454 ']' 00:06:11.728 20:53:27 -- common/autotest_common.sh@940 -- # kill -0 175454 00:06:11.728 20:53:27 -- common/autotest_common.sh@941 -- # uname 00:06:11.728 20:53:27 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:11.728 20:53:27 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 175454 00:06:11.728 20:53:27 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:11.728 20:53:27 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:11.728 20:53:27 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 175454' 00:06:11.728 killing process with pid 175454 00:06:11.728 20:53:27 -- common/autotest_common.sh@955 -- # kill 175454 00:06:11.728 20:53:27 -- common/autotest_common.sh@960 -- # wait 175454 00:06:11.987 00:06:11.987 real 0m0.848s 00:06:11.987 user 0m0.788s 00:06:11.987 sys 0m0.377s 00:06:11.987 20:53:27 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:11.987 20:53:27 -- common/autotest_common.sh@10 -- # set +x 00:06:11.987 ************************************ 00:06:11.987 END TEST dpdk_mem_utility 00:06:11.987 ************************************ 00:06:12.245 20:53:27 -- spdk/autotest.sh@177 -- # run_test event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:06:12.245 20:53:27 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:12.245 20:53:27 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:12.245 20:53:27 -- common/autotest_common.sh@10 -- # set +x 00:06:12.245 
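The whole dpdk_mem_utility flow above is two commands: an RPC that dumps DPDK allocator state to /tmp/spdk_mem_dump.txt, and a parser over that dump. Both can be rerun against any live SPDK application (paths relative to an SPDK checkout):

./scripts/rpc.py env_dpdk_get_mem_stats   # writes /tmp/spdk_mem_dump.txt
./scripts/dpdk_mem_info.py                # heap/mempool/memzone summary, as above
./scripts/dpdk_mem_info.py -m 0           # per-element detail for heap id 0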
************************************ 00:06:12.245 START TEST event 00:06:12.245 ************************************ 00:06:12.245 20:53:27 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:06:12.503 * Looking for test storage... 00:06:12.503 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:06:12.503 20:53:27 -- event/event.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/bdev/nbd_common.sh 00:06:12.503 20:53:27 -- bdev/nbd_common.sh@6 -- # set -e 00:06:12.503 20:53:27 -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:06:12.503 20:53:27 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:06:12.503 20:53:27 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:12.503 20:53:27 -- common/autotest_common.sh@10 -- # set +x 00:06:12.503 ************************************ 00:06:12.503 START TEST event_perf 00:06:12.503 ************************************ 00:06:12.503 20:53:28 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:06:12.503 Running I/O for 1 seconds...[2024-04-25 20:53:28.097170] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:06:12.503 [2024-04-25 20:53:28.097250] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid175652 ] 00:06:12.503 EAL: No free 2048 kB hugepages reported on node 1 00:06:12.503 [2024-04-25 20:53:28.136707] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:06:12.760 [2024-04-25 20:53:28.169638] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:12.760 [2024-04-25 20:53:28.208809] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:12.760 [2024-04-25 20:53:28.208906] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:12.760 [2024-04-25 20:53:28.208979] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:12.760 [2024-04-25 20:53:28.208980] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:13.694 Running I/O for 1 seconds... 00:06:13.694 lcore 0: 192367 00:06:13.694 lcore 1: 192363 00:06:13.694 lcore 2: 192363 00:06:13.694 lcore 3: 192364 00:06:13.694 done. 
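event_perf pins one reactor per set bit of the -m 0xF mask, which is why four lcores report event counts after the one-second (-t 1) run. An illustrative one-liner for decoding such a mask, not part of the test itself:

mask=0xF; printf 'cores:'
for i in $(seq 0 31); do (( (mask >> i) & 1 )) && printf ' %d' "$i"; done; echo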
00:06:13.694 00:06:13.694 real 0m1.183s 00:06:13.694 user 0m4.084s 00:06:13.694 sys 0m0.095s 00:06:13.694 20:53:29 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:13.694 20:53:29 -- common/autotest_common.sh@10 -- # set +x 00:06:13.694 ************************************ 00:06:13.694 END TEST event_perf 00:06:13.694 ************************************ 00:06:13.694 20:53:29 -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:06:13.694 20:53:29 -- common/autotest_common.sh@1087 -- # '[' 4 -le 1 ']' 00:06:13.694 20:53:29 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:13.694 20:53:29 -- common/autotest_common.sh@10 -- # set +x 00:06:13.952 ************************************ 00:06:13.952 START TEST event_reactor 00:06:13.952 ************************************ 00:06:13.952 20:53:29 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:06:13.952 [2024-04-25 20:53:29.486208] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:06:13.952 [2024-04-25 20:53:29.486309] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid175941 ] 00:06:13.952 EAL: No free 2048 kB hugepages reported on node 1 00:06:13.952 [2024-04-25 20:53:29.526067] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:06:13.952 [2024-04-25 20:53:29.557497] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:13.952 [2024-04-25 20:53:29.594182] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:15.325 test_start 00:06:15.325 oneshot 00:06:15.325 tick 100 00:06:15.325 tick 100 00:06:15.325 tick 250 00:06:15.325 tick 100 00:06:15.325 tick 100 00:06:15.325 tick 100 00:06:15.325 tick 250 00:06:15.325 tick 500 00:06:15.325 tick 100 00:06:15.325 tick 100 00:06:15.325 tick 250 00:06:15.325 tick 100 00:06:15.325 tick 100 00:06:15.325 test_end 00:06:15.325 00:06:15.325 real 0m1.182s 00:06:15.325 user 0m1.090s 00:06:15.325 sys 0m0.087s 00:06:15.325 20:53:30 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:15.325 20:53:30 -- common/autotest_common.sh@10 -- # set +x 00:06:15.325 ************************************ 00:06:15.325 END TEST event_reactor 00:06:15.325 ************************************ 00:06:15.325 20:53:30 -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:06:15.325 20:53:30 -- common/autotest_common.sh@1087 -- # '[' 4 -le 1 ']' 00:06:15.325 20:53:30 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:15.325 20:53:30 -- common/autotest_common.sh@10 -- # set +x 00:06:15.325 ************************************ 00:06:15.325 START TEST event_reactor_perf 00:06:15.325 ************************************ 00:06:15.325 20:53:30 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:06:15.325 [2024-04-25 20:53:30.868623] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 
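Each event binary exercised in this block is a standalone executable from the SPDK build; the invocations, as they appear in the trace and runnable by hand from the SPDK tree, are:

./test/event/event_perf/event_perf -m 0xF -t 1   # multi-reactor event rate
./test/event/reactor/reactor -t 1                # oneshot/tick trace shown above
./test/event/reactor_perf/reactor_perf -t 1      # single-reactor events/second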
00:06:15.325 [2024-04-25 20:53:30.868708] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid176228 ] 00:06:15.325 EAL: No free 2048 kB hugepages reported on node 1 00:06:15.325 [2024-04-25 20:53:30.907115] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:06:15.325 [2024-04-25 20:53:30.938484] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:15.325 [2024-04-25 20:53:30.973854] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:16.693 test_start 00:06:16.693 test_end 00:06:16.693 Performance: 947416 events per second 00:06:16.693 00:06:16.693 real 0m1.172s 00:06:16.693 user 0m1.081s 00:06:16.693 sys 0m0.086s 00:06:16.693 20:53:32 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:16.693 20:53:32 -- common/autotest_common.sh@10 -- # set +x 00:06:16.693 ************************************ 00:06:16.693 END TEST event_reactor_perf 00:06:16.693 ************************************ 00:06:16.693 20:53:32 -- event/event.sh@49 -- # uname -s 00:06:16.693 20:53:32 -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:06:16.693 20:53:32 -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:06:16.693 20:53:32 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:16.693 20:53:32 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:16.693 20:53:32 -- common/autotest_common.sh@10 -- # set +x 00:06:16.693 ************************************ 00:06:16.693 START TEST event_scheduler 00:06:16.693 ************************************ 00:06:16.693 20:53:32 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:06:16.693 * Looking for test storage... 00:06:16.693 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler 00:06:16.693 20:53:32 -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:06:16.693 20:53:32 -- scheduler/scheduler.sh@35 -- # scheduler_pid=176546 00:06:16.693 20:53:32 -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:06:16.693 20:53:32 -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:06:16.693 20:53:32 -- scheduler/scheduler.sh@37 -- # waitforlisten 176546 00:06:16.693 20:53:32 -- common/autotest_common.sh@817 -- # '[' -z 176546 ']' 00:06:16.693 20:53:32 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:16.693 20:53:32 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:16.693 20:53:32 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:16.693 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:16.693 20:53:32 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:16.693 20:53:32 -- common/autotest_common.sh@10 -- # set +x 00:06:16.693 [2024-04-25 20:53:32.330490] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 
00:06:16.693 [2024-04-25 20:53:32.330568] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid176546 ] 00:06:16.951 EAL: No free 2048 kB hugepages reported on node 1 00:06:16.951 [2024-04-25 20:53:32.369271] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:06:16.951 [2024-04-25 20:53:32.397324] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:16.951 [2024-04-25 20:53:32.436779] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:16.951 [2024-04-25 20:53:32.436801] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:16.951 [2024-04-25 20:53:32.436890] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:16.951 [2024-04-25 20:53:32.436892] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:16.951 20:53:32 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:16.951 20:53:32 -- common/autotest_common.sh@850 -- # return 0 00:06:16.951 20:53:32 -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:06:16.951 20:53:32 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:16.951 20:53:32 -- common/autotest_common.sh@10 -- # set +x 00:06:16.951 POWER: Env isn't set yet! 00:06:16.951 POWER: Attempting to initialise ACPI cpufreq power management... 00:06:16.951 POWER: Failed to write /sys/devices/system/cpu/cpu%u/cpufreq/scaling_governor 00:06:16.951 POWER: Cannot set governor of lcore 0 to userspace 00:06:16.951 POWER: Attempting to initialise PSTAT power management... 00:06:16.951 POWER: Power management governor of lcore 0 has been set to 'performance' successfully 00:06:16.951 POWER: Initialized successfully for lcore 0 power management 00:06:16.951 POWER: Power management governor of lcore 1 has been set to 'performance' successfully 00:06:16.951 POWER: Initialized successfully for lcore 1 power management 00:06:16.951 POWER: Power management governor of lcore 2 has been set to 'performance' successfully 00:06:16.951 POWER: Initialized successfully for lcore 2 power management 00:06:16.951 POWER: Power management governor of lcore 3 has been set to 'performance' successfully 00:06:16.951 POWER: Initialized successfully for lcore 3 power management 00:06:16.951 20:53:32 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:16.951 20:53:32 -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:06:16.951 20:53:32 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:16.951 20:53:32 -- common/autotest_common.sh@10 -- # set +x 00:06:16.951 [2024-04-25 20:53:32.612476] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 
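Because the scheduler app is launched with --wait-for-rpc, subsystem init is deferred until the test selects a scheduler over RPC; the POWER governor switches above are the dynamic scheduler taking ownership of cpufreq. The same two calls by hand (default UNIX socket assumed):

./scripts/rpc.py framework_set_scheduler dynamic   # as done here, before init completes
./scripts/rpc.py framework_start_init              # finishes startup, reactors begin polling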
00:06:16.951 20:53:32 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:16.951 20:53:32 -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:06:16.951 20:53:32 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:16.951 20:53:32 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:16.951 20:53:32 -- common/autotest_common.sh@10 -- # set +x 00:06:17.209 ************************************ 00:06:17.209 START TEST scheduler_create_thread 00:06:17.209 ************************************ 00:06:17.209 20:53:32 -- common/autotest_common.sh@1111 -- # scheduler_create_thread 00:06:17.209 20:53:32 -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:06:17.209 20:53:32 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:17.209 20:53:32 -- common/autotest_common.sh@10 -- # set +x 00:06:17.209 2 00:06:17.209 20:53:32 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:17.209 20:53:32 -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:06:17.209 20:53:32 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:17.209 20:53:32 -- common/autotest_common.sh@10 -- # set +x 00:06:17.209 3 00:06:17.209 20:53:32 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:17.209 20:53:32 -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:06:17.209 20:53:32 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:17.209 20:53:32 -- common/autotest_common.sh@10 -- # set +x 00:06:17.209 4 00:06:17.209 20:53:32 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:17.209 20:53:32 -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:06:17.209 20:53:32 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:17.209 20:53:32 -- common/autotest_common.sh@10 -- # set +x 00:06:17.209 5 00:06:17.209 20:53:32 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:17.209 20:53:32 -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:06:17.209 20:53:32 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:17.209 20:53:32 -- common/autotest_common.sh@10 -- # set +x 00:06:17.209 6 00:06:17.209 20:53:32 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:17.209 20:53:32 -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:06:17.209 20:53:32 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:17.209 20:53:32 -- common/autotest_common.sh@10 -- # set +x 00:06:17.209 7 00:06:17.209 20:53:32 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:17.209 20:53:32 -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:06:17.209 20:53:32 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:17.209 20:53:32 -- common/autotest_common.sh@10 -- # set +x 00:06:17.209 8 00:06:17.209 20:53:32 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:17.209 20:53:32 -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:06:17.209 20:53:32 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:17.209 20:53:32 -- common/autotest_common.sh@10 -- # set +x 00:06:17.209 9 00:06:17.209 
20:53:32 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:17.209 20:53:32 -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:06:17.209 20:53:32 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:17.209 20:53:32 -- common/autotest_common.sh@10 -- # set +x 00:06:17.209 10 00:06:17.209 20:53:32 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:17.209 20:53:32 -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:06:17.209 20:53:32 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:17.209 20:53:32 -- common/autotest_common.sh@10 -- # set +x 00:06:17.209 20:53:32 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:17.209 20:53:32 -- scheduler/scheduler.sh@22 -- # thread_id=11 00:06:17.209 20:53:32 -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:06:17.209 20:53:32 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:17.209 20:53:32 -- common/autotest_common.sh@10 -- # set +x 00:06:17.774 20:53:33 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:17.774 20:53:33 -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:06:17.774 20:53:33 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:17.774 20:53:33 -- common/autotest_common.sh@10 -- # set +x 00:06:19.146 20:53:34 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:19.146 20:53:34 -- scheduler/scheduler.sh@25 -- # thread_id=12 00:06:19.146 20:53:34 -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:06:19.146 20:53:34 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:19.146 20:53:34 -- common/autotest_common.sh@10 -- # set +x 00:06:20.520 20:53:35 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:20.520 00:06:20.520 real 0m3.101s 00:06:20.520 user 0m0.018s 00:06:20.520 sys 0m0.010s 00:06:20.520 20:53:35 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:20.520 20:53:35 -- common/autotest_common.sh@10 -- # set +x 00:06:20.520 ************************************ 00:06:20.520 END TEST scheduler_create_thread 00:06:20.520 ************************************ 00:06:20.520 20:53:35 -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:06:20.520 20:53:35 -- scheduler/scheduler.sh@46 -- # killprocess 176546 00:06:20.520 20:53:35 -- common/autotest_common.sh@936 -- # '[' -z 176546 ']' 00:06:20.520 20:53:35 -- common/autotest_common.sh@940 -- # kill -0 176546 00:06:20.520 20:53:35 -- common/autotest_common.sh@941 -- # uname 00:06:20.520 20:53:35 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:20.520 20:53:35 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 176546 00:06:20.520 20:53:35 -- common/autotest_common.sh@942 -- # process_name=reactor_2 00:06:20.520 20:53:35 -- common/autotest_common.sh@946 -- # '[' reactor_2 = sudo ']' 00:06:20.520 20:53:35 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 176546' 00:06:20.520 killing process with pid 176546 00:06:20.520 20:53:35 -- common/autotest_common.sh@955 -- # kill 176546 00:06:20.520 20:53:35 -- common/autotest_common.sh@960 -- # wait 176546 00:06:20.779 [2024-04-25 20:53:36.247901] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 
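The thread create/activate/delete calls above come through a test-only RPC plugin; scheduler_thread_create and friends are not part of the stock rpc.py surface. A hedged sketch of driving one call by hand, assuming the plugin module sits alongside the scheduler test binary (the PYTHONPATH location is an assumption):

PYTHONPATH=./test/event/scheduler ./scripts/rpc.py --plugin scheduler_plugin \
    scheduler_thread_create -n active_pinned -m 0x1 -a 100   # busy thread pinned to lcore 0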
00:06:20.779 POWER: Power management governor of lcore 0 has been set to 'powersave' successfully 00:06:20.779 POWER: Power management of lcore 0 has exited from 'performance' mode and been set back to the original 00:06:20.779 POWER: Power management governor of lcore 1 has been set to 'powersave' successfully 00:06:20.779 POWER: Power management of lcore 1 has exited from 'performance' mode and been set back to the original 00:06:20.779 POWER: Power management governor of lcore 2 has been set to 'powersave' successfully 00:06:20.779 POWER: Power management of lcore 2 has exited from 'performance' mode and been set back to the original 00:06:20.779 POWER: Power management governor of lcore 3 has been set to 'powersave' successfully 00:06:20.779 POWER: Power management of lcore 3 has exited from 'performance' mode and been set back to the original 00:06:21.037 00:06:21.037 real 0m4.225s 00:06:21.037 user 0m6.893s 00:06:21.037 sys 0m0.497s 00:06:21.037 20:53:36 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:21.037 20:53:36 -- common/autotest_common.sh@10 -- # set +x 00:06:21.037 ************************************ 00:06:21.037 END TEST event_scheduler 00:06:21.037 ************************************ 00:06:21.037 20:53:36 -- event/event.sh@51 -- # modprobe -n nbd 00:06:21.037 20:53:36 -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:06:21.037 20:53:36 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:21.037 20:53:36 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:21.037 20:53:36 -- common/autotest_common.sh@10 -- # set +x 00:06:21.037 ************************************ 00:06:21.037 START TEST app_repeat 00:06:21.037 ************************************ 00:06:21.037 20:53:36 -- common/autotest_common.sh@1111 -- # app_repeat_test 00:06:21.037 20:53:36 -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:21.037 20:53:36 -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:21.037 20:53:36 -- event/event.sh@13 -- # local nbd_list 00:06:21.037 20:53:36 -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:21.037 20:53:36 -- event/event.sh@14 -- # local bdev_list 00:06:21.037 20:53:36 -- event/event.sh@15 -- # local repeat_times=4 00:06:21.037 20:53:36 -- event/event.sh@17 -- # modprobe nbd 00:06:21.037 20:53:36 -- event/event.sh@19 -- # repeat_pid=177406 00:06:21.037 20:53:36 -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:06:21.038 20:53:36 -- event/event.sh@18 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:06:21.038 20:53:36 -- event/event.sh@21 -- # echo 'Process app_repeat pid: 177406' 00:06:21.038 Process app_repeat pid: 177406 00:06:21.038 20:53:36 -- event/event.sh@23 -- # for i in {0..2} 00:06:21.038 20:53:36 -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:06:21.038 spdk_app_start Round 0 00:06:21.038 20:53:36 -- event/event.sh@25 -- # waitforlisten 177406 /var/tmp/spdk-nbd.sock 00:06:21.038 20:53:36 -- common/autotest_common.sh@817 -- # '[' -z 177406 ']' 00:06:21.038 20:53:36 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:21.038 20:53:36 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:21.038 20:53:36 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 
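app_repeat is started with -r /var/tmp/spdk-nbd.sock, so every RPC in this block must target that socket instead of the default /var/tmp/spdk.sock; -m 0x3 gives it two reactors. The first bdev seen below, for example, corresponds to:

./scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096   # 64 MiB, 4 KiB blocks -> Malloc0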
00:06:21.038 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:21.038 20:53:36 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:21.038 20:53:36 -- common/autotest_common.sh@10 -- # set +x 00:06:21.038 [2024-04-25 20:53:36.681294] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:06:21.038 [2024-04-25 20:53:36.681378] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid177406 ] 00:06:21.296 EAL: No free 2048 kB hugepages reported on node 1 00:06:21.296 [2024-04-25 20:53:36.722769] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:06:21.296 [2024-04-25 20:53:36.756283] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:21.296 [2024-04-25 20:53:36.798363] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:21.296 [2024-04-25 20:53:36.798366] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:21.296 20:53:36 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:21.296 20:53:36 -- common/autotest_common.sh@850 -- # return 0 00:06:21.296 20:53:36 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:21.554 Malloc0 00:06:21.554 20:53:37 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:21.813 Malloc1 00:06:21.813 20:53:37 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:21.813 20:53:37 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:21.813 20:53:37 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:21.813 20:53:37 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:21.813 20:53:37 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:21.813 20:53:37 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:21.813 20:53:37 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:21.813 20:53:37 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:21.813 20:53:37 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:21.813 20:53:37 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:21.813 20:53:37 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:21.813 20:53:37 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:21.813 20:53:37 -- bdev/nbd_common.sh@12 -- # local i 00:06:21.813 20:53:37 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:21.813 20:53:37 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:21.813 20:53:37 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:21.813 /dev/nbd0 00:06:21.813 20:53:37 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:21.813 20:53:37 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:21.813 20:53:37 -- common/autotest_common.sh@854 -- # local nbd_name=nbd0 00:06:21.813 20:53:37 -- common/autotest_common.sh@855 -- # local i 00:06:21.813 20:53:37 -- common/autotest_common.sh@857 -- # (( i = 1 )) 00:06:21.813 
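What follows is the round's data path: export both malloc bdevs over NBD, write a 1 MiB random pattern through each block device, compare it back, then tear the exports down. Condensed into the RPCs and commands visible in the trace (the temp-file path is illustrative):

rpc="./scripts/rpc.py -s /var/tmp/spdk-nbd.sock"
$rpc nbd_start_disk Malloc0 /dev/nbd0
$rpc nbd_start_disk Malloc1 /dev/nbd1
dd if=/dev/urandom of=/tmp/nbdrandtest bs=4096 count=256          # 1 MiB pattern
for d in /dev/nbd0 /dev/nbd1; do
  dd if=/tmp/nbdrandtest of="$d" bs=4096 count=256 oflag=direct   # write it out
  cmp -b -n 1M /tmp/nbdrandtest "$d"                              # read-back verify
done
$rpc nbd_stop_disk /dev/nbd0; $rpc nbd_stop_disk /dev/nbd1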
20:53:37 -- common/autotest_common.sh@857 -- # (( i <= 20 )) 00:06:21.813 20:53:37 -- common/autotest_common.sh@858 -- # grep -q -w nbd0 /proc/partitions 00:06:21.813 20:53:37 -- common/autotest_common.sh@859 -- # break 00:06:21.813 20:53:37 -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:06:21.813 20:53:37 -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:06:21.813 20:53:37 -- common/autotest_common.sh@871 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:21.813 1+0 records in 00:06:21.813 1+0 records out 00:06:21.813 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000227664 s, 18.0 MB/s 00:06:21.813 20:53:37 -- common/autotest_common.sh@872 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:21.813 20:53:37 -- common/autotest_common.sh@872 -- # size=4096 00:06:21.813 20:53:37 -- common/autotest_common.sh@873 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:21.813 20:53:37 -- common/autotest_common.sh@874 -- # '[' 4096 '!=' 0 ']' 00:06:21.813 20:53:37 -- common/autotest_common.sh@875 -- # return 0 00:06:21.813 20:53:37 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:21.813 20:53:37 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:21.813 20:53:37 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:22.071 /dev/nbd1 00:06:22.071 20:53:37 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:22.071 20:53:37 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:22.071 20:53:37 -- common/autotest_common.sh@854 -- # local nbd_name=nbd1 00:06:22.071 20:53:37 -- common/autotest_common.sh@855 -- # local i 00:06:22.071 20:53:37 -- common/autotest_common.sh@857 -- # (( i = 1 )) 00:06:22.071 20:53:37 -- common/autotest_common.sh@857 -- # (( i <= 20 )) 00:06:22.071 20:53:37 -- common/autotest_common.sh@858 -- # grep -q -w nbd1 /proc/partitions 00:06:22.071 20:53:37 -- common/autotest_common.sh@859 -- # break 00:06:22.071 20:53:37 -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:06:22.071 20:53:37 -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:06:22.071 20:53:37 -- common/autotest_common.sh@871 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:22.071 1+0 records in 00:06:22.071 1+0 records out 00:06:22.071 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000232265 s, 17.6 MB/s 00:06:22.071 20:53:37 -- common/autotest_common.sh@872 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:22.071 20:53:37 -- common/autotest_common.sh@872 -- # size=4096 00:06:22.071 20:53:37 -- common/autotest_common.sh@873 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:22.071 20:53:37 -- common/autotest_common.sh@874 -- # '[' 4096 '!=' 0 ']' 00:06:22.071 20:53:37 -- common/autotest_common.sh@875 -- # return 0 00:06:22.071 20:53:37 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:22.071 20:53:37 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:22.071 20:53:37 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:22.071 20:53:37 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:22.071 20:53:37 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:22.329 20:53:37 -- 
bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:22.329 { 00:06:22.329 "nbd_device": "/dev/nbd0", 00:06:22.329 "bdev_name": "Malloc0" 00:06:22.329 }, 00:06:22.329 { 00:06:22.329 "nbd_device": "/dev/nbd1", 00:06:22.329 "bdev_name": "Malloc1" 00:06:22.329 } 00:06:22.329 ]' 00:06:22.329 20:53:37 -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:22.329 { 00:06:22.329 "nbd_device": "/dev/nbd0", 00:06:22.329 "bdev_name": "Malloc0" 00:06:22.329 }, 00:06:22.329 { 00:06:22.329 "nbd_device": "/dev/nbd1", 00:06:22.329 "bdev_name": "Malloc1" 00:06:22.329 } 00:06:22.329 ]' 00:06:22.329 20:53:37 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:22.329 20:53:37 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:22.329 /dev/nbd1' 00:06:22.329 20:53:37 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:22.329 /dev/nbd1' 00:06:22.329 20:53:37 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:22.329 20:53:37 -- bdev/nbd_common.sh@65 -- # count=2 00:06:22.329 20:53:37 -- bdev/nbd_common.sh@66 -- # echo 2 00:06:22.329 20:53:37 -- bdev/nbd_common.sh@95 -- # count=2 00:06:22.329 20:53:37 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:22.329 20:53:37 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:22.329 20:53:37 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:22.329 20:53:37 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:22.329 20:53:37 -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:22.329 20:53:37 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:22.329 20:53:37 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:22.329 20:53:37 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:22.329 256+0 records in 00:06:22.329 256+0 records out 00:06:22.329 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0110056 s, 95.3 MB/s 00:06:22.329 20:53:37 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:22.329 20:53:37 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:22.329 256+0 records in 00:06:22.329 256+0 records out 00:06:22.329 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0202544 s, 51.8 MB/s 00:06:22.329 20:53:37 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:22.329 20:53:37 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:22.329 256+0 records in 00:06:22.329 256+0 records out 00:06:22.329 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0215826 s, 48.6 MB/s 00:06:22.329 20:53:37 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:22.329 20:53:37 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:22.329 20:53:37 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:22.329 20:53:37 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:22.329 20:53:37 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:22.329 20:53:37 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:22.329 20:53:37 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:22.329 20:53:37 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:22.329 20:53:37 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:22.329 20:53:37 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:22.329 20:53:37 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:22.329 20:53:37 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:22.329 20:53:37 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:22.329 20:53:37 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:22.329 20:53:37 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:22.329 20:53:37 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:22.329 20:53:37 -- bdev/nbd_common.sh@51 -- # local i 00:06:22.329 20:53:37 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:22.329 20:53:37 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:22.587 20:53:38 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:22.587 20:53:38 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:22.587 20:53:38 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:22.587 20:53:38 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:22.587 20:53:38 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:22.587 20:53:38 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:22.587 20:53:38 -- bdev/nbd_common.sh@41 -- # break 00:06:22.587 20:53:38 -- bdev/nbd_common.sh@45 -- # return 0 00:06:22.587 20:53:38 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:22.587 20:53:38 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:22.844 20:53:38 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:22.844 20:53:38 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:22.844 20:53:38 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:22.844 20:53:38 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:22.844 20:53:38 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:22.844 20:53:38 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:22.844 20:53:38 -- bdev/nbd_common.sh@41 -- # break 00:06:22.844 20:53:38 -- bdev/nbd_common.sh@45 -- # return 0 00:06:22.844 20:53:38 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:22.845 20:53:38 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:22.845 20:53:38 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:23.102 20:53:38 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:23.102 20:53:38 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:23.102 20:53:38 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:23.102 20:53:38 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:23.102 20:53:38 -- bdev/nbd_common.sh@65 -- # echo '' 00:06:23.102 20:53:38 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:23.102 20:53:38 -- bdev/nbd_common.sh@65 -- # true 00:06:23.102 20:53:38 -- bdev/nbd_common.sh@65 -- # count=0 00:06:23.102 20:53:38 -- bdev/nbd_common.sh@66 -- # echo 0 00:06:23.102 20:53:38 -- bdev/nbd_common.sh@104 -- # count=0 00:06:23.102 20:53:38 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:23.102 20:53:38 -- 
bdev/nbd_common.sh@109 -- # return 0 00:06:23.102 20:53:38 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:23.361 20:53:38 -- event/event.sh@35 -- # sleep 3 00:06:23.361 [2024-04-25 20:53:38.944220] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:23.361 [2024-04-25 20:53:38.976923] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:23.361 [2024-04-25 20:53:38.976925] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:23.361 [2024-04-25 20:53:39.016081] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:23.361 [2024-04-25 20:53:39.016125] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:26.642 20:53:41 -- event/event.sh@23 -- # for i in {0..2} 00:06:26.642 20:53:41 -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:06:26.642 spdk_app_start Round 1 00:06:26.642 20:53:41 -- event/event.sh@25 -- # waitforlisten 177406 /var/tmp/spdk-nbd.sock 00:06:26.642 20:53:41 -- common/autotest_common.sh@817 -- # '[' -z 177406 ']' 00:06:26.642 20:53:41 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:26.642 20:53:41 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:26.642 20:53:41 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:26.642 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:26.642 20:53:41 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:26.642 20:53:41 -- common/autotest_common.sh@10 -- # set +x 00:06:26.642 20:53:41 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:26.642 20:53:41 -- common/autotest_common.sh@850 -- # return 0 00:06:26.642 20:53:41 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:26.642 Malloc0 00:06:26.642 20:53:42 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:26.642 Malloc1 00:06:26.900 20:53:42 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:26.900 20:53:42 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:26.900 20:53:42 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:26.900 20:53:42 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:26.900 20:53:42 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:26.900 20:53:42 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:26.900 20:53:42 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:26.900 20:53:42 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:26.900 20:53:42 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:26.900 20:53:42 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:26.900 20:53:42 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:26.900 20:53:42 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:26.900 20:53:42 -- bdev/nbd_common.sh@12 -- # local i 00:06:26.900 20:53:42 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:26.900 20:53:42 -- bdev/nbd_common.sh@14 -- # 
(( i < 2 )) 00:06:26.900 20:53:42 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:26.900 /dev/nbd0 00:06:26.900 20:53:42 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:26.900 20:53:42 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:26.900 20:53:42 -- common/autotest_common.sh@854 -- # local nbd_name=nbd0 00:06:26.900 20:53:42 -- common/autotest_common.sh@855 -- # local i 00:06:26.900 20:53:42 -- common/autotest_common.sh@857 -- # (( i = 1 )) 00:06:26.900 20:53:42 -- common/autotest_common.sh@857 -- # (( i <= 20 )) 00:06:26.900 20:53:42 -- common/autotest_common.sh@858 -- # grep -q -w nbd0 /proc/partitions 00:06:26.900 20:53:42 -- common/autotest_common.sh@859 -- # break 00:06:26.900 20:53:42 -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:06:26.900 20:53:42 -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:06:26.900 20:53:42 -- common/autotest_common.sh@871 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:26.900 1+0 records in 00:06:26.900 1+0 records out 00:06:26.900 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000259173 s, 15.8 MB/s 00:06:26.901 20:53:42 -- common/autotest_common.sh@872 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:26.901 20:53:42 -- common/autotest_common.sh@872 -- # size=4096 00:06:26.901 20:53:42 -- common/autotest_common.sh@873 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:26.901 20:53:42 -- common/autotest_common.sh@874 -- # '[' 4096 '!=' 0 ']' 00:06:26.901 20:53:42 -- common/autotest_common.sh@875 -- # return 0 00:06:26.901 20:53:42 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:26.901 20:53:42 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:26.901 20:53:42 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:27.159 /dev/nbd1 00:06:27.159 20:53:42 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:27.159 20:53:42 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:27.159 20:53:42 -- common/autotest_common.sh@854 -- # local nbd_name=nbd1 00:06:27.159 20:53:42 -- common/autotest_common.sh@855 -- # local i 00:06:27.159 20:53:42 -- common/autotest_common.sh@857 -- # (( i = 1 )) 00:06:27.159 20:53:42 -- common/autotest_common.sh@857 -- # (( i <= 20 )) 00:06:27.159 20:53:42 -- common/autotest_common.sh@858 -- # grep -q -w nbd1 /proc/partitions 00:06:27.159 20:53:42 -- common/autotest_common.sh@859 -- # break 00:06:27.159 20:53:42 -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:06:27.159 20:53:42 -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:06:27.159 20:53:42 -- common/autotest_common.sh@871 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:27.159 1+0 records in 00:06:27.159 1+0 records out 00:06:27.159 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000238515 s, 17.2 MB/s 00:06:27.159 20:53:42 -- common/autotest_common.sh@872 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:27.159 20:53:42 -- common/autotest_common.sh@872 -- # size=4096 00:06:27.159 20:53:42 -- common/autotest_common.sh@873 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:27.159 20:53:42 -- common/autotest_common.sh@874 -- # '[' 4096 
'!=' 0 ']' 00:06:27.159 20:53:42 -- common/autotest_common.sh@875 -- # return 0 00:06:27.159 20:53:42 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:27.159 20:53:42 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:27.159 20:53:42 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:27.159 20:53:42 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:27.159 20:53:42 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:27.417 20:53:42 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:27.417 { 00:06:27.417 "nbd_device": "/dev/nbd0", 00:06:27.417 "bdev_name": "Malloc0" 00:06:27.417 }, 00:06:27.417 { 00:06:27.417 "nbd_device": "/dev/nbd1", 00:06:27.417 "bdev_name": "Malloc1" 00:06:27.417 } 00:06:27.417 ]' 00:06:27.417 20:53:42 -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:27.417 { 00:06:27.417 "nbd_device": "/dev/nbd0", 00:06:27.417 "bdev_name": "Malloc0" 00:06:27.417 }, 00:06:27.417 { 00:06:27.417 "nbd_device": "/dev/nbd1", 00:06:27.417 "bdev_name": "Malloc1" 00:06:27.417 } 00:06:27.417 ]' 00:06:27.417 20:53:42 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:27.417 20:53:42 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:27.417 /dev/nbd1' 00:06:27.417 20:53:42 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:27.417 /dev/nbd1' 00:06:27.418 20:53:42 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:27.418 20:53:42 -- bdev/nbd_common.sh@65 -- # count=2 00:06:27.418 20:53:42 -- bdev/nbd_common.sh@66 -- # echo 2 00:06:27.418 20:53:42 -- bdev/nbd_common.sh@95 -- # count=2 00:06:27.418 20:53:42 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:27.418 20:53:42 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:27.418 20:53:42 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:27.418 20:53:42 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:27.418 20:53:42 -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:27.418 20:53:42 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:27.418 20:53:42 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:27.418 20:53:42 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:27.418 256+0 records in 00:06:27.418 256+0 records out 00:06:27.418 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0114565 s, 91.5 MB/s 00:06:27.418 20:53:42 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:27.418 20:53:42 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:27.418 256+0 records in 00:06:27.418 256+0 records out 00:06:27.418 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0203339 s, 51.6 MB/s 00:06:27.418 20:53:43 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:27.418 20:53:43 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:27.418 256+0 records in 00:06:27.418 256+0 records out 00:06:27.418 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.021632 s, 48.5 MB/s 00:06:27.418 20:53:43 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:27.418 20:53:43 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:27.418 
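
The dd traffic above is the whole write pass of nbd_rpc_data_verify: seed 1 MiB of random data (256 blocks of 4096 bytes), push it onto each exported NBD device with O_DIRECT, then byte-compare each device against the seed file (the verify pass follows just below). A condensed sketch of that flow as the trace shows it, with the temp path shortened:

# Condensed from the nbd_dd_data_verify trace; tmp path shortened,
# block size/count mirror the log (4096 x 256 = 1 MiB).
tmp_file=/tmp/nbdrandtest
nbd_list=(/dev/nbd0 /dev/nbd1)

# write pass: seed random data, then raw-copy it to every NBD device
dd if=/dev/urandom of="$tmp_file" bs=4096 count=256
for dev in "${nbd_list[@]}"; do
    dd if="$tmp_file" of="$dev" bs=4096 count=256 oflag=direct
done

# verify pass: compare the first 1 MiB of each device with the seed file
for dev in "${nbd_list[@]}"; do
    cmp -b -n 1M "$tmp_file" "$dev"
done
rm "$tmp_file"
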
20:53:43 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:27.418 20:53:43 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:27.418 20:53:43 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:27.418 20:53:43 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:27.418 20:53:43 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:27.418 20:53:43 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:27.418 20:53:43 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:27.418 20:53:43 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:27.418 20:53:43 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:27.418 20:53:43 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:27.418 20:53:43 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:27.418 20:53:43 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:27.418 20:53:43 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:27.418 20:53:43 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:27.418 20:53:43 -- bdev/nbd_common.sh@51 -- # local i 00:06:27.418 20:53:43 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:27.418 20:53:43 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:27.674 20:53:43 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:27.674 20:53:43 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:27.674 20:53:43 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:27.674 20:53:43 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:27.674 20:53:43 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:27.674 20:53:43 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:27.674 20:53:43 -- bdev/nbd_common.sh@41 -- # break 00:06:27.674 20:53:43 -- bdev/nbd_common.sh@45 -- # return 0 00:06:27.674 20:53:43 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:27.674 20:53:43 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:27.932 20:53:43 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:27.932 20:53:43 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:27.932 20:53:43 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:27.932 20:53:43 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:27.932 20:53:43 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:27.932 20:53:43 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:27.932 20:53:43 -- bdev/nbd_common.sh@41 -- # break 00:06:27.932 20:53:43 -- bdev/nbd_common.sh@45 -- # return 0 00:06:27.932 20:53:43 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:27.932 20:53:43 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:27.932 20:53:43 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:28.190 20:53:43 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:28.190 20:53:43 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:28.190 20:53:43 -- bdev/nbd_common.sh@64 -- # jq -r 
'.[] | .nbd_device' 00:06:28.190 20:53:43 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:28.190 20:53:43 -- bdev/nbd_common.sh@65 -- # echo '' 00:06:28.190 20:53:43 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:28.190 20:53:43 -- bdev/nbd_common.sh@65 -- # true 00:06:28.190 20:53:43 -- bdev/nbd_common.sh@65 -- # count=0 00:06:28.190 20:53:43 -- bdev/nbd_common.sh@66 -- # echo 0 00:06:28.190 20:53:43 -- bdev/nbd_common.sh@104 -- # count=0 00:06:28.190 20:53:43 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:28.190 20:53:43 -- bdev/nbd_common.sh@109 -- # return 0 00:06:28.190 20:53:43 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:28.449 20:53:43 -- event/event.sh@35 -- # sleep 3 00:06:28.449 [2024-04-25 20:53:44.026013] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:28.449 [2024-04-25 20:53:44.058620] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:28.449 [2024-04-25 20:53:44.058621] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:28.449 [2024-04-25 20:53:44.098551] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:28.449 [2024-04-25 20:53:44.098600] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:31.730 20:53:46 -- event/event.sh@23 -- # for i in {0..2} 00:06:31.730 20:53:46 -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:06:31.730 spdk_app_start Round 2 00:06:31.730 20:53:46 -- event/event.sh@25 -- # waitforlisten 177406 /var/tmp/spdk-nbd.sock 00:06:31.730 20:53:46 -- common/autotest_common.sh@817 -- # '[' -z 177406 ']' 00:06:31.730 20:53:46 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:31.730 20:53:46 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:31.730 20:53:46 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:31.730 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:06:31.730 20:53:46 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:31.730 20:53:46 -- common/autotest_common.sh@10 -- # set +x 00:06:31.730 20:53:47 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:31.730 20:53:47 -- common/autotest_common.sh@850 -- # return 0 00:06:31.730 20:53:47 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:31.730 Malloc0 00:06:31.730 20:53:47 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:31.730 Malloc1 00:06:31.988 20:53:47 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:31.988 20:53:47 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:31.988 20:53:47 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:31.988 20:53:47 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:31.988 20:53:47 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:31.988 20:53:47 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:31.988 20:53:47 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:31.988 20:53:47 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:31.988 20:53:47 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:31.988 20:53:47 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:31.988 20:53:47 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:31.988 20:53:47 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:31.988 20:53:47 -- bdev/nbd_common.sh@12 -- # local i 00:06:31.988 20:53:47 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:31.988 20:53:47 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:31.988 20:53:47 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:31.988 /dev/nbd0 00:06:31.988 20:53:47 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:31.988 20:53:47 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:31.988 20:53:47 -- common/autotest_common.sh@854 -- # local nbd_name=nbd0 00:06:31.988 20:53:47 -- common/autotest_common.sh@855 -- # local i 00:06:31.988 20:53:47 -- common/autotest_common.sh@857 -- # (( i = 1 )) 00:06:31.988 20:53:47 -- common/autotest_common.sh@857 -- # (( i <= 20 )) 00:06:31.988 20:53:47 -- common/autotest_common.sh@858 -- # grep -q -w nbd0 /proc/partitions 00:06:31.988 20:53:47 -- common/autotest_common.sh@859 -- # break 00:06:31.988 20:53:47 -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:06:31.988 20:53:47 -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:06:31.988 20:53:47 -- common/autotest_common.sh@871 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:31.988 1+0 records in 00:06:31.988 1+0 records out 00:06:31.988 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000178638 s, 22.9 MB/s 00:06:31.988 20:53:47 -- common/autotest_common.sh@872 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:31.988 20:53:47 -- common/autotest_common.sh@872 -- # size=4096 00:06:31.988 20:53:47 -- common/autotest_common.sh@873 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:31.988 20:53:47 -- 
common/autotest_common.sh@874 -- # '[' 4096 '!=' 0 ']' 00:06:31.988 20:53:47 -- common/autotest_common.sh@875 -- # return 0 00:06:31.988 20:53:47 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:31.988 20:53:47 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:31.988 20:53:47 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:32.246 /dev/nbd1 00:06:32.246 20:53:47 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:32.246 20:53:47 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:32.246 20:53:47 -- common/autotest_common.sh@854 -- # local nbd_name=nbd1 00:06:32.246 20:53:47 -- common/autotest_common.sh@855 -- # local i 00:06:32.246 20:53:47 -- common/autotest_common.sh@857 -- # (( i = 1 )) 00:06:32.246 20:53:47 -- common/autotest_common.sh@857 -- # (( i <= 20 )) 00:06:32.246 20:53:47 -- common/autotest_common.sh@858 -- # grep -q -w nbd1 /proc/partitions 00:06:32.246 20:53:47 -- common/autotest_common.sh@859 -- # break 00:06:32.246 20:53:47 -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:06:32.246 20:53:47 -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:06:32.246 20:53:47 -- common/autotest_common.sh@871 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:32.246 1+0 records in 00:06:32.246 1+0 records out 00:06:32.246 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000241036 s, 17.0 MB/s 00:06:32.246 20:53:47 -- common/autotest_common.sh@872 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:32.246 20:53:47 -- common/autotest_common.sh@872 -- # size=4096 00:06:32.246 20:53:47 -- common/autotest_common.sh@873 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:32.246 20:53:47 -- common/autotest_common.sh@874 -- # '[' 4096 '!=' 0 ']' 00:06:32.246 20:53:47 -- common/autotest_common.sh@875 -- # return 0 00:06:32.246 20:53:47 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:32.246 20:53:47 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:32.246 20:53:47 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:32.246 20:53:47 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:32.246 20:53:47 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:32.505 20:53:47 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:32.505 { 00:06:32.505 "nbd_device": "/dev/nbd0", 00:06:32.505 "bdev_name": "Malloc0" 00:06:32.505 }, 00:06:32.505 { 00:06:32.505 "nbd_device": "/dev/nbd1", 00:06:32.505 "bdev_name": "Malloc1" 00:06:32.505 } 00:06:32.505 ]' 00:06:32.505 20:53:47 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:32.505 20:53:47 -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:32.505 { 00:06:32.505 "nbd_device": "/dev/nbd0", 00:06:32.505 "bdev_name": "Malloc0" 00:06:32.505 }, 00:06:32.505 { 00:06:32.505 "nbd_device": "/dev/nbd1", 00:06:32.505 "bdev_name": "Malloc1" 00:06:32.505 } 00:06:32.505 ]' 00:06:32.505 20:53:48 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:32.505 /dev/nbd1' 00:06:32.505 20:53:48 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:32.505 /dev/nbd1' 00:06:32.505 20:53:48 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:32.505 20:53:48 -- bdev/nbd_common.sh@65 -- # count=2 00:06:32.505 20:53:48 -- bdev/nbd_common.sh@66 -- # echo 2 00:06:32.505 20:53:48 -- 
bdev/nbd_common.sh@95 -- # count=2 00:06:32.505 20:53:48 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:32.505 20:53:48 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:32.505 20:53:48 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:32.505 20:53:48 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:32.505 20:53:48 -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:32.505 20:53:48 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:32.505 20:53:48 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:32.505 20:53:48 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:32.505 256+0 records in 00:06:32.505 256+0 records out 00:06:32.505 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0114209 s, 91.8 MB/s 00:06:32.505 20:53:48 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:32.505 20:53:48 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:32.505 256+0 records in 00:06:32.505 256+0 records out 00:06:32.505 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0202692 s, 51.7 MB/s 00:06:32.505 20:53:48 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:32.505 20:53:48 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:32.505 256+0 records in 00:06:32.505 256+0 records out 00:06:32.505 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0216841 s, 48.4 MB/s 00:06:32.505 20:53:48 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:32.505 20:53:48 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:32.505 20:53:48 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:32.505 20:53:48 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:32.505 20:53:48 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:32.505 20:53:48 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:32.505 20:53:48 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:32.505 20:53:48 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:32.505 20:53:48 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:32.505 20:53:48 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:32.505 20:53:48 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:32.505 20:53:48 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:32.505 20:53:48 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:32.505 20:53:48 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:32.505 20:53:48 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:32.505 20:53:48 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:32.505 20:53:48 -- bdev/nbd_common.sh@51 -- # local i 00:06:32.505 20:53:48 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:32.505 20:53:48 -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:32.763 20:53:48 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:32.763 20:53:48 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:32.763 20:53:48 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:32.763 20:53:48 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:32.763 20:53:48 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:32.763 20:53:48 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:32.763 20:53:48 -- bdev/nbd_common.sh@41 -- # break 00:06:32.763 20:53:48 -- bdev/nbd_common.sh@45 -- # return 0 00:06:32.763 20:53:48 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:32.763 20:53:48 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:33.021 20:53:48 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:33.021 20:53:48 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:33.021 20:53:48 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:33.021 20:53:48 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:33.021 20:53:48 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:33.021 20:53:48 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:33.021 20:53:48 -- bdev/nbd_common.sh@41 -- # break 00:06:33.021 20:53:48 -- bdev/nbd_common.sh@45 -- # return 0 00:06:33.021 20:53:48 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:33.021 20:53:48 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:33.021 20:53:48 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:33.021 20:53:48 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:33.021 20:53:48 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:33.021 20:53:48 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:33.280 20:53:48 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:33.280 20:53:48 -- bdev/nbd_common.sh@65 -- # echo '' 00:06:33.280 20:53:48 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:33.280 20:53:48 -- bdev/nbd_common.sh@65 -- # true 00:06:33.280 20:53:48 -- bdev/nbd_common.sh@65 -- # count=0 00:06:33.280 20:53:48 -- bdev/nbd_common.sh@66 -- # echo 0 00:06:33.280 20:53:48 -- bdev/nbd_common.sh@104 -- # count=0 00:06:33.280 20:53:48 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:33.280 20:53:48 -- bdev/nbd_common.sh@109 -- # return 0 00:06:33.280 20:53:48 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:33.280 20:53:48 -- event/event.sh@35 -- # sleep 3 00:06:33.538 [2024-04-25 20:53:49.076648] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:33.538 [2024-04-25 20:53:49.109328] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:33.539 [2024-04-25 20:53:49.109329] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:33.539 [2024-04-25 20:53:49.147724] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:33.539 [2024-04-25 20:53:49.147766] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 
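
Every NBD start and stop in these rounds leans on the same polling idiom visible in the trace: up to 20 probes of /proc/partitions, breaking as soon as the device name appears (waitfornbd) or disappears (waitfornbd_exit), with a one-block dd smoke-read after a successful startup probe. A condensed sketch; the delay between probes is an assumption, since the xtrace records only the counter bounds and the greps:

# Condensed from the waitfornbd / waitfornbd_exit traces; the sleep
# between probes is assumed (not visible in the xtrace).
waitfornbd() {
    local nbd_name=$1 i
    for ((i = 1; i <= 20; i++)); do
        grep -q -w "$nbd_name" /proc/partitions && break
        sleep 0.1
    done
    # smoke-test: read one 4 KiB block back, as the dd/stat pair in the trace does
    dd if="/dev/$nbd_name" of=/tmp/nbdtest bs=4096 count=1 iflag=direct || return 1
    local size
    size=$(stat -c %s /tmp/nbdtest)
    rm -f /tmp/nbdtest
    [ "$size" != 0 ]
}

waitfornbd_exit() {
    local nbd_name=$1 i
    for ((i = 1; i <= 20; i++)); do
        grep -q -w "$nbd_name" /proc/partitions || return 0
        sleep 0.1
    done
    return 1
}
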
00:06:36.821 20:53:51 -- event/event.sh@38 -- # waitforlisten 177406 /var/tmp/spdk-nbd.sock 00:06:36.821 20:53:51 -- common/autotest_common.sh@817 -- # '[' -z 177406 ']' 00:06:36.821 20:53:51 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:36.821 20:53:51 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:36.821 20:53:51 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:36.821 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:36.821 20:53:51 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:36.821 20:53:51 -- common/autotest_common.sh@10 -- # set +x 00:06:36.821 20:53:52 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:36.821 20:53:52 -- common/autotest_common.sh@850 -- # return 0 00:06:36.821 20:53:52 -- event/event.sh@39 -- # killprocess 177406 00:06:36.821 20:53:52 -- common/autotest_common.sh@936 -- # '[' -z 177406 ']' 00:06:36.821 20:53:52 -- common/autotest_common.sh@940 -- # kill -0 177406 00:06:36.821 20:53:52 -- common/autotest_common.sh@941 -- # uname 00:06:36.821 20:53:52 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:36.821 20:53:52 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 177406 00:06:36.821 20:53:52 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:36.821 20:53:52 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:36.821 20:53:52 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 177406' 00:06:36.821 killing process with pid 177406 00:06:36.821 20:53:52 -- common/autotest_common.sh@955 -- # kill 177406 00:06:36.821 20:53:52 -- common/autotest_common.sh@960 -- # wait 177406 00:06:36.821 spdk_app_start is called in Round 0. 00:06:36.821 Shutdown signal received, stop current app iteration 00:06:36.821 Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 reinitialization... 00:06:36.821 spdk_app_start is called in Round 1. 00:06:36.821 Shutdown signal received, stop current app iteration 00:06:36.821 Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 reinitialization... 00:06:36.821 spdk_app_start is called in Round 2. 00:06:36.821 Shutdown signal received, stop current app iteration 00:06:36.821 Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 reinitialization... 00:06:36.821 spdk_app_start is called in Round 3. 
00:06:36.821 Shutdown signal received, stop current app iteration
00:06:36.821 20:53:52 -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT
00:06:36.821 20:53:52 -- event/event.sh@42 -- # return 0
00:06:36.821
00:06:36.821 real 0m15.631s
00:06:36.821 user 0m33.226s
00:06:36.821 sys 0m3.104s
00:06:36.821 20:53:52 -- common/autotest_common.sh@1112 -- # xtrace_disable
00:06:36.821 20:53:52 -- common/autotest_common.sh@10 -- # set +x
00:06:36.821 ************************************
00:06:36.821 END TEST app_repeat
00:06:36.821 ************************************
00:06:36.821 20:53:52 -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 ))
00:06:36.821 20:53:52 -- event/event.sh@55 -- # run_test cpu_locks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh
00:06:36.821 20:53:52 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:06:36.821 20:53:52 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:06:36.821 20:53:52 -- common/autotest_common.sh@10 -- # set +x
00:06:36.821 ************************************
00:06:36.821 START TEST cpu_locks
00:06:36.821 ************************************
00:06:37.079 20:53:52 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh
00:06:37.079 * Looking for test storage...
00:06:37.079 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event
00:06:37.079 20:53:52 -- event/cpu_locks.sh@11 -- # rpc_sock1=/var/tmp/spdk.sock
00:06:37.079 20:53:52 -- event/cpu_locks.sh@12 -- # rpc_sock2=/var/tmp/spdk2.sock
00:06:37.079 20:53:52 -- event/cpu_locks.sh@164 -- # trap cleanup EXIT SIGTERM SIGINT
00:06:37.079 20:53:52 -- event/cpu_locks.sh@166 -- # run_test default_locks default_locks
00:06:37.079 20:53:52 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:06:37.079 20:53:52 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:06:37.079 20:53:52 -- common/autotest_common.sh@10 -- # set +x
00:06:37.079 ************************************
00:06:37.079 START TEST default_locks
00:06:37.079 ************************************
00:06:37.079 20:53:52 -- common/autotest_common.sh@1111 -- # default_locks
00:06:37.079 20:53:52 -- event/cpu_locks.sh@46 -- # spdk_tgt_pid=180338
00:06:37.079 20:53:52 -- event/cpu_locks.sh@47 -- # waitforlisten 180338
00:06:37.079 20:53:52 -- event/cpu_locks.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1
00:06:37.079 20:53:52 -- common/autotest_common.sh@817 -- # '[' -z 180338 ']'
00:06:37.079 20:53:52 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock
00:06:37.079 20:53:52 -- common/autotest_common.sh@822 -- # local max_retries=100
00:06:37.079 20:53:52 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:06:37.079 20:53:52 -- common/autotest_common.sh@826 -- # xtrace_disable
00:06:37.079 20:53:52 -- common/autotest_common.sh@10 -- # set +x
00:06:37.079 [2024-04-25 20:53:52.707529] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization...
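
Before cpu_locks gets going, note the killprocess idiom from the app_repeat teardown above; every subtest below reuses it: confirm the PID is still alive, confirm it names an SPDK reactor rather than the sudo wrapper, signal it, then reap it so its lock files are released before the next test. Reassembled as a sketch (Linux branch only, error handling trimmed):

# Reassembled from the killprocess trace (autotest_common.sh@936-960).
killprocess() {
    local pid=$1
    [ -z "$pid" ] && return 1
    kill -0 "$pid" || return 1                        # PID must still be alive
    local process_name
    process_name=$(ps --no-headers -o comm= "$pid")   # e.g. reactor_0
    [ "$process_name" = sudo ] && return 1            # never signal the sudo wrapper
    echo "killing process with pid $pid"
    kill "$pid"
    wait "$pid"   # reap the child so its CPU lock files are gone before the next test
}
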
00:06:37.079 [2024-04-25 20:53:52.707607] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid180338 ] 00:06:37.079 EAL: No free 2048 kB hugepages reported on node 1 00:06:37.338 [2024-04-25 20:53:52.744119] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:06:37.338 [2024-04-25 20:53:52.775752] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:37.338 [2024-04-25 20:53:52.812482] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:37.338 20:53:52 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:37.338 20:53:52 -- common/autotest_common.sh@850 -- # return 0 00:06:37.338 20:53:52 -- event/cpu_locks.sh@49 -- # locks_exist 180338 00:06:37.338 20:53:52 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:37.338 20:53:52 -- event/cpu_locks.sh@22 -- # lslocks -p 180338 00:06:38.289 lslocks: write error 00:06:38.289 20:53:53 -- event/cpu_locks.sh@50 -- # killprocess 180338 00:06:38.289 20:53:53 -- common/autotest_common.sh@936 -- # '[' -z 180338 ']' 00:06:38.289 20:53:53 -- common/autotest_common.sh@940 -- # kill -0 180338 00:06:38.289 20:53:53 -- common/autotest_common.sh@941 -- # uname 00:06:38.289 20:53:53 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:38.289 20:53:53 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 180338 00:06:38.289 20:53:53 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:38.289 20:53:53 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:38.289 20:53:53 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 180338' 00:06:38.289 killing process with pid 180338 00:06:38.289 20:53:53 -- common/autotest_common.sh@955 -- # kill 180338 00:06:38.289 20:53:53 -- common/autotest_common.sh@960 -- # wait 180338 00:06:38.547 20:53:54 -- event/cpu_locks.sh@52 -- # NOT waitforlisten 180338 00:06:38.547 20:53:54 -- common/autotest_common.sh@638 -- # local es=0 00:06:38.547 20:53:54 -- common/autotest_common.sh@640 -- # valid_exec_arg waitforlisten 180338 00:06:38.547 20:53:54 -- common/autotest_common.sh@626 -- # local arg=waitforlisten 00:06:38.547 20:53:54 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:38.547 20:53:54 -- common/autotest_common.sh@630 -- # type -t waitforlisten 00:06:38.547 20:53:54 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:38.547 20:53:54 -- common/autotest_common.sh@641 -- # waitforlisten 180338 00:06:38.547 20:53:54 -- common/autotest_common.sh@817 -- # '[' -z 180338 ']' 00:06:38.547 20:53:54 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:38.547 20:53:54 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:38.547 20:53:54 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:38.547 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
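
The check that gives this file its name is locks_exist, traced above for pid 180338: ask lslocks which file locks the target holds and grep for the spdk_cpu_lock prefix. The stray "lslocks: write error" in the log is harmless; grep -q exits on its first match and closes the pipe, so lslocks fails a later write. As a sketch:

# Sketch of locks_exist (cpu_locks.sh@22): succeed iff $pid holds a
# file lock whose path mentions spdk_cpu_lock.
locks_exist() {
    local pid=$1
    lslocks -p "$pid" | grep -q spdk_cpu_lock
    # grep -q closes the pipe after the first hit, which is what makes
    # lslocks print "lslocks: write error" in the log above.
}
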
00:06:38.547 20:53:54 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:38.547 20:53:54 -- common/autotest_common.sh@10 -- # set +x 00:06:38.548 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 832: kill: (180338) - No such process 00:06:38.548 ERROR: process (pid: 180338) is no longer running 00:06:38.548 20:53:54 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:38.548 20:53:54 -- common/autotest_common.sh@850 -- # return 1 00:06:38.548 20:53:54 -- common/autotest_common.sh@641 -- # es=1 00:06:38.548 20:53:54 -- common/autotest_common.sh@649 -- # (( es > 128 )) 00:06:38.548 20:53:54 -- common/autotest_common.sh@660 -- # [[ -n '' ]] 00:06:38.548 20:53:54 -- common/autotest_common.sh@665 -- # (( !es == 0 )) 00:06:38.548 20:53:54 -- event/cpu_locks.sh@54 -- # no_locks 00:06:38.548 20:53:54 -- event/cpu_locks.sh@26 -- # lock_files=() 00:06:38.548 20:53:54 -- event/cpu_locks.sh@26 -- # local lock_files 00:06:38.548 20:53:54 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:06:38.548 00:06:38.548 real 0m1.421s 00:06:38.548 user 0m1.387s 00:06:38.548 sys 0m0.690s 00:06:38.548 20:53:54 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:38.548 20:53:54 -- common/autotest_common.sh@10 -- # set +x 00:06:38.548 ************************************ 00:06:38.548 END TEST default_locks 00:06:38.548 ************************************ 00:06:38.548 20:53:54 -- event/cpu_locks.sh@167 -- # run_test default_locks_via_rpc default_locks_via_rpc 00:06:38.548 20:53:54 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:38.548 20:53:54 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:38.548 20:53:54 -- common/autotest_common.sh@10 -- # set +x 00:06:38.807 ************************************ 00:06:38.807 START TEST default_locks_via_rpc 00:06:38.807 ************************************ 00:06:38.807 20:53:54 -- common/autotest_common.sh@1111 -- # default_locks_via_rpc 00:06:38.807 20:53:54 -- event/cpu_locks.sh@62 -- # spdk_tgt_pid=180710 00:06:38.807 20:53:54 -- event/cpu_locks.sh@63 -- # waitforlisten 180710 00:06:38.807 20:53:54 -- event/cpu_locks.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:38.807 20:53:54 -- common/autotest_common.sh@817 -- # '[' -z 180710 ']' 00:06:38.807 20:53:54 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:38.807 20:53:54 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:38.807 20:53:54 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:38.807 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:38.807 20:53:54 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:38.807 20:53:54 -- common/autotest_common.sh@10 -- # set +x 00:06:38.807 [2024-04-25 20:53:54.322169] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:06:38.807 [2024-04-25 20:53:54.322249] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid180710 ] 00:06:38.807 EAL: No free 2048 kB hugepages reported on node 1 00:06:38.807 [2024-04-25 20:53:54.358401] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 
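
default_locks closed just above with a negative assertion: once the target is dead, waitforlisten for the same PID has to fail, and the NOT wrapper converts that failure into a pass (hence the "No such process" and "is no longer running" noise, followed by es=1 and a clean return). The wrapper's shape, per the traced checks:

# Shape of the NOT wrapper per the traced checks (@638-665): run a
# command that is expected to fail, and succeed only if it did.
NOT() {
    local es=0
    "$@" || es=$?
    # the trace also probes es > 128 (signal exits) and an empty
    # allow-list ([[ -n '' ]]); both are no-ops here and omitted
    (( es != 0 ))   # the trace's final (( !es == 0 ))
}

# usage, as above: NOT waitforlisten 180338
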
00:06:38.807 [2024-04-25 20:53:54.390034] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:38.807 [2024-04-25 20:53:54.427779] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:39.066 20:53:54 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:39.066 20:53:54 -- common/autotest_common.sh@850 -- # return 0 00:06:39.066 20:53:54 -- event/cpu_locks.sh@65 -- # rpc_cmd framework_disable_cpumask_locks 00:06:39.066 20:53:54 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:39.066 20:53:54 -- common/autotest_common.sh@10 -- # set +x 00:06:39.066 20:53:54 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:39.066 20:53:54 -- event/cpu_locks.sh@67 -- # no_locks 00:06:39.066 20:53:54 -- event/cpu_locks.sh@26 -- # lock_files=() 00:06:39.066 20:53:54 -- event/cpu_locks.sh@26 -- # local lock_files 00:06:39.066 20:53:54 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:06:39.066 20:53:54 -- event/cpu_locks.sh@69 -- # rpc_cmd framework_enable_cpumask_locks 00:06:39.066 20:53:54 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:39.066 20:53:54 -- common/autotest_common.sh@10 -- # set +x 00:06:39.066 20:53:54 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:39.066 20:53:54 -- event/cpu_locks.sh@71 -- # locks_exist 180710 00:06:39.066 20:53:54 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:39.066 20:53:54 -- event/cpu_locks.sh@22 -- # lslocks -p 180710 00:06:39.634 20:53:55 -- event/cpu_locks.sh@73 -- # killprocess 180710 00:06:39.634 20:53:55 -- common/autotest_common.sh@936 -- # '[' -z 180710 ']' 00:06:39.634 20:53:55 -- common/autotest_common.sh@940 -- # kill -0 180710 00:06:39.634 20:53:55 -- common/autotest_common.sh@941 -- # uname 00:06:39.634 20:53:55 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:39.634 20:53:55 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 180710 00:06:39.634 20:53:55 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:39.634 20:53:55 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:39.634 20:53:55 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 180710' 00:06:39.634 killing process with pid 180710 00:06:39.634 20:53:55 -- common/autotest_common.sh@955 -- # kill 180710 00:06:39.634 20:53:55 -- common/autotest_common.sh@960 -- # wait 180710 00:06:39.893 00:06:39.893 real 0m1.253s 00:06:39.893 user 0m1.202s 00:06:39.893 sys 0m0.602s 00:06:39.893 20:53:55 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:39.893 20:53:55 -- common/autotest_common.sh@10 -- # set +x 00:06:39.893 ************************************ 00:06:39.893 END TEST default_locks_via_rpc 00:06:39.893 ************************************ 00:06:40.152 20:53:55 -- event/cpu_locks.sh@168 -- # run_test non_locking_app_on_locked_coremask non_locking_app_on_locked_coremask 00:06:40.152 20:53:55 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:40.152 20:53:55 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:40.152 20:53:55 -- common/autotest_common.sh@10 -- # set +x 00:06:40.152 ************************************ 00:06:40.152 START TEST non_locking_app_on_locked_coremask 00:06:40.152 ************************************ 00:06:40.152 20:53:55 -- common/autotest_common.sh@1111 -- # non_locking_app_on_locked_coremask 00:06:40.152 20:53:55 -- event/cpu_locks.sh@80 -- # spdk_tgt_pid=180967 00:06:40.152 20:53:55 -- event/cpu_locks.sh@81 -- # waitforlisten 180967 /var/tmp/spdk.sock 00:06:40.152 20:53:55 -- event/cpu_locks.sh@79 
-- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:40.152 20:53:55 -- common/autotest_common.sh@817 -- # '[' -z 180967 ']' 00:06:40.152 20:53:55 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:40.152 20:53:55 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:40.153 20:53:55 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:40.153 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:40.153 20:53:55 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:40.153 20:53:55 -- common/autotest_common.sh@10 -- # set +x 00:06:40.153 [2024-04-25 20:53:55.767902] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:06:40.153 [2024-04-25 20:53:55.767969] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid180967 ] 00:06:40.153 EAL: No free 2048 kB hugepages reported on node 1 00:06:40.153 [2024-04-25 20:53:55.805461] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:06:40.412 [2024-04-25 20:53:55.839236] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:40.412 [2024-04-25 20:53:55.879564] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:40.412 20:53:56 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:40.412 20:53:56 -- common/autotest_common.sh@850 -- # return 0 00:06:40.412 20:53:56 -- event/cpu_locks.sh@84 -- # spdk_tgt_pid2=181139 00:06:40.412 20:53:56 -- event/cpu_locks.sh@85 -- # waitforlisten 181139 /var/tmp/spdk2.sock 00:06:40.412 20:53:56 -- event/cpu_locks.sh@83 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock 00:06:40.412 20:53:56 -- common/autotest_common.sh@817 -- # '[' -z 181139 ']' 00:06:40.412 20:53:56 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:40.412 20:53:56 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:40.412 20:53:56 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:40.412 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:40.412 20:53:56 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:40.412 20:53:56 -- common/autotest_common.sh@10 -- # set +x 00:06:40.671 [2024-04-25 20:53:56.090251] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:06:40.671 [2024-04-25 20:53:56.090316] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid181139 ] 00:06:40.671 EAL: No free 2048 kB hugepages reported on node 1 00:06:40.671 [2024-04-25 20:53:56.129479] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:06:40.671 [2024-04-25 20:53:56.180512] app.c: 825:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
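
non_locking_app_on_locked_coremask boils down to the two launches traced above: pid 180967 claims core 0 the normal way, and pid 181139 can only share the same -m 0x1 mask because --disable-cpumask-locks (note the "CPU core locks deactivated" notice) skips the claim, with -r pointing it at a second RPC socket so the two targets do not collide there either. Reduced to essentials, with $SPDK_BIN standing in for the long Jenkins build path:

# The two launches above, reduced to essentials; $SPDK_BIN stands in
# for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin.
"$SPDK_BIN"/spdk_tgt -m 0x1 &                 # pid 180967: claims the core 0 lock
waitforlisten $! /var/tmp/spdk.sock

"$SPDK_BIN"/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock &  # pid 181139
waitforlisten $! /var/tmp/spdk2.sock          # comes up fine: no lock claim attempted
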
00:06:40.671 [2024-04-25 20:53:56.180533] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:40.671 [2024-04-25 20:53:56.252139] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:41.306 20:53:56 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:41.306 20:53:56 -- common/autotest_common.sh@850 -- # return 0 00:06:41.306 20:53:56 -- event/cpu_locks.sh@87 -- # locks_exist 180967 00:06:41.306 20:53:56 -- event/cpu_locks.sh@22 -- # lslocks -p 180967 00:06:41.306 20:53:56 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:42.268 lslocks: write error 00:06:42.268 20:53:57 -- event/cpu_locks.sh@89 -- # killprocess 180967 00:06:42.269 20:53:57 -- common/autotest_common.sh@936 -- # '[' -z 180967 ']' 00:06:42.269 20:53:57 -- common/autotest_common.sh@940 -- # kill -0 180967 00:06:42.269 20:53:57 -- common/autotest_common.sh@941 -- # uname 00:06:42.269 20:53:57 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:42.269 20:53:57 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 180967 00:06:42.269 20:53:57 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:42.269 20:53:57 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:42.269 20:53:57 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 180967' 00:06:42.269 killing process with pid 180967 00:06:42.269 20:53:57 -- common/autotest_common.sh@955 -- # kill 180967 00:06:42.269 20:53:57 -- common/autotest_common.sh@960 -- # wait 180967 00:06:42.837 20:53:58 -- event/cpu_locks.sh@90 -- # killprocess 181139 00:06:42.837 20:53:58 -- common/autotest_common.sh@936 -- # '[' -z 181139 ']' 00:06:42.837 20:53:58 -- common/autotest_common.sh@940 -- # kill -0 181139 00:06:42.837 20:53:58 -- common/autotest_common.sh@941 -- # uname 00:06:42.837 20:53:58 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:42.837 20:53:58 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 181139 00:06:42.837 20:53:58 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:42.837 20:53:58 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:42.837 20:53:58 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 181139' 00:06:42.837 killing process with pid 181139 00:06:42.837 20:53:58 -- common/autotest_common.sh@955 -- # kill 181139 00:06:42.837 20:53:58 -- common/autotest_common.sh@960 -- # wait 181139 00:06:43.406 00:06:43.406 real 0m3.018s 00:06:43.406 user 0m3.100s 00:06:43.406 sys 0m1.140s 00:06:43.406 20:53:58 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:43.406 20:53:58 -- common/autotest_common.sh@10 -- # set +x 00:06:43.406 ************************************ 00:06:43.406 END TEST non_locking_app_on_locked_coremask 00:06:43.406 ************************************ 00:06:43.406 20:53:58 -- event/cpu_locks.sh@169 -- # run_test locking_app_on_unlocked_coremask locking_app_on_unlocked_coremask 00:06:43.406 20:53:58 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:43.406 20:53:58 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:43.406 20:53:58 -- common/autotest_common.sh@10 -- # set +x 00:06:43.406 ************************************ 00:06:43.406 START TEST locking_app_on_unlocked_coremask 00:06:43.406 ************************************ 00:06:43.406 20:53:58 -- common/autotest_common.sh@1111 -- # locking_app_on_unlocked_coremask 00:06:43.406 20:53:58 -- event/cpu_locks.sh@97 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 
-m 0x1 --disable-cpumask-locks 00:06:43.406 20:53:58 -- event/cpu_locks.sh@98 -- # spdk_tgt_pid=181632 00:06:43.406 20:53:58 -- event/cpu_locks.sh@99 -- # waitforlisten 181632 /var/tmp/spdk.sock 00:06:43.406 20:53:58 -- common/autotest_common.sh@817 -- # '[' -z 181632 ']' 00:06:43.406 20:53:58 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:43.406 20:53:58 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:43.406 20:53:58 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:43.406 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:43.406 20:53:58 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:43.406 20:53:58 -- common/autotest_common.sh@10 -- # set +x 00:06:43.406 [2024-04-25 20:53:58.971397] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:06:43.406 [2024-04-25 20:53:58.971453] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid181632 ] 00:06:43.406 EAL: No free 2048 kB hugepages reported on node 1 00:06:43.406 [2024-04-25 20:53:59.006233] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:06:43.406 [2024-04-25 20:53:59.036724] app.c: 825:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:43.406 [2024-04-25 20:53:59.036745] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:43.665 [2024-04-25 20:53:59.075026] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:43.665 20:53:59 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:43.665 20:53:59 -- common/autotest_common.sh@850 -- # return 0 00:06:43.665 20:53:59 -- event/cpu_locks.sh@102 -- # spdk_tgt_pid2=181764 00:06:43.665 20:53:59 -- event/cpu_locks.sh@103 -- # waitforlisten 181764 /var/tmp/spdk2.sock 00:06:43.665 20:53:59 -- event/cpu_locks.sh@101 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:06:43.665 20:53:59 -- common/autotest_common.sh@817 -- # '[' -z 181764 ']' 00:06:43.665 20:53:59 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:43.665 20:53:59 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:43.665 20:53:59 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:43.665 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:43.665 20:53:59 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:43.665 20:53:59 -- common/autotest_common.sh@10 -- # set +x 00:06:43.665 [2024-04-25 20:53:59.286342] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:06:43.665 [2024-04-25 20:53:59.286428] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid181764 ] 00:06:43.665 EAL: No free 2048 kB hugepages reported on node 1 00:06:43.665 [2024-04-25 20:53:59.325473] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. 
Enabled only for validation. 00:06:43.925 [2024-04-25 20:53:59.380387] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:43.925 [2024-04-25 20:53:59.453012] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:44.493 20:54:00 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:44.493 20:54:00 -- common/autotest_common.sh@850 -- # return 0 00:06:44.493 20:54:00 -- event/cpu_locks.sh@105 -- # locks_exist 181764 00:06:44.493 20:54:00 -- event/cpu_locks.sh@22 -- # lslocks -p 181764 00:06:44.493 20:54:00 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:45.873 lslocks: write error 00:06:45.873 20:54:01 -- event/cpu_locks.sh@107 -- # killprocess 181632 00:06:45.873 20:54:01 -- common/autotest_common.sh@936 -- # '[' -z 181632 ']' 00:06:45.873 20:54:01 -- common/autotest_common.sh@940 -- # kill -0 181632 00:06:45.873 20:54:01 -- common/autotest_common.sh@941 -- # uname 00:06:45.873 20:54:01 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:45.873 20:54:01 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 181632 00:06:45.873 20:54:01 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:45.873 20:54:01 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:45.873 20:54:01 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 181632' 00:06:45.873 killing process with pid 181632 00:06:45.873 20:54:01 -- common/autotest_common.sh@955 -- # kill 181632 00:06:45.873 20:54:01 -- common/autotest_common.sh@960 -- # wait 181632 00:06:46.132 20:54:01 -- event/cpu_locks.sh@108 -- # killprocess 181764 00:06:46.132 20:54:01 -- common/autotest_common.sh@936 -- # '[' -z 181764 ']' 00:06:46.132 20:54:01 -- common/autotest_common.sh@940 -- # kill -0 181764 00:06:46.132 20:54:01 -- common/autotest_common.sh@941 -- # uname 00:06:46.132 20:54:01 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:46.132 20:54:01 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 181764 00:06:46.132 20:54:01 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:46.132 20:54:01 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:46.132 20:54:01 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 181764' 00:06:46.132 killing process with pid 181764 00:06:46.132 20:54:01 -- common/autotest_common.sh@955 -- # kill 181764 00:06:46.132 20:54:01 -- common/autotest_common.sh@960 -- # wait 181764 00:06:46.700 00:06:46.700 real 0m3.115s 00:06:46.700 user 0m3.188s 00:06:46.700 sys 0m1.170s 00:06:46.700 20:54:02 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:46.700 20:54:02 -- common/autotest_common.sh@10 -- # set +x 00:06:46.700 ************************************ 00:06:46.700 END TEST locking_app_on_unlocked_coremask 00:06:46.700 ************************************ 00:06:46.700 20:54:02 -- event/cpu_locks.sh@170 -- # run_test locking_app_on_locked_coremask locking_app_on_locked_coremask 00:06:46.700 20:54:02 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:46.700 20:54:02 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:46.700 20:54:02 -- common/autotest_common.sh@10 -- # set +x 00:06:46.700 ************************************ 00:06:46.700 START TEST locking_app_on_locked_coremask 00:06:46.700 ************************************ 00:06:46.700 20:54:02 -- common/autotest_common.sh@1111 -- # locking_app_on_locked_coremask 00:06:46.700 20:54:02 -- event/cpu_locks.sh@114 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:46.700 20:54:02 -- event/cpu_locks.sh@115 -- # spdk_tgt_pid=182457 00:06:46.700 20:54:02 -- event/cpu_locks.sh@116 -- # waitforlisten 182457 /var/tmp/spdk.sock 00:06:46.700 20:54:02 -- common/autotest_common.sh@817 -- # '[' -z 182457 ']' 00:06:46.700 20:54:02 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:46.700 20:54:02 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:46.700 20:54:02 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:46.700 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:46.700 20:54:02 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:46.700 20:54:02 -- common/autotest_common.sh@10 -- # set +x 00:06:46.700 [2024-04-25 20:54:02.279437] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:06:46.700 [2024-04-25 20:54:02.279488] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid182457 ] 00:06:46.700 EAL: No free 2048 kB hugepages reported on node 1 00:06:46.700 [2024-04-25 20:54:02.314036] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:06:46.700 [2024-04-25 20:54:02.344870] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:46.960 [2024-04-25 20:54:02.383715] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:46.960 20:54:02 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:46.960 20:54:02 -- common/autotest_common.sh@850 -- # return 0 00:06:46.960 20:54:02 -- event/cpu_locks.sh@118 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:06:46.960 20:54:02 -- event/cpu_locks.sh@119 -- # spdk_tgt_pid2=182465 00:06:46.960 20:54:02 -- event/cpu_locks.sh@120 -- # NOT waitforlisten 182465 /var/tmp/spdk2.sock 00:06:46.960 20:54:02 -- common/autotest_common.sh@638 -- # local es=0 00:06:46.960 20:54:02 -- common/autotest_common.sh@640 -- # valid_exec_arg waitforlisten 182465 /var/tmp/spdk2.sock 00:06:46.960 20:54:02 -- common/autotest_common.sh@626 -- # local arg=waitforlisten 00:06:46.960 20:54:02 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:46.960 20:54:02 -- common/autotest_common.sh@630 -- # type -t waitforlisten 00:06:46.960 20:54:02 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:46.960 20:54:02 -- common/autotest_common.sh@641 -- # waitforlisten 182465 /var/tmp/spdk2.sock 00:06:46.960 20:54:02 -- common/autotest_common.sh@817 -- # '[' -z 182465 ']' 00:06:46.960 20:54:02 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:46.960 20:54:02 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:46.960 20:54:02 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:46.960 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
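(Annotation) The waitforlisten calls above block until the target's RPC UNIX socket appears. A minimal sketch of that pattern, assuming a simplified version of the autotest_common.sh helper (the real one also retries RPC probes and honors max_retries=100):

    waitforlisten_sketch() {
        local pid=$1 rpc_addr=${2:-/var/tmp/spdk.sock}
        local max_retries=100
        echo "Waiting for process to start up and listen on UNIX domain socket $rpc_addr..."
        while (( max_retries-- > 0 )); do
            kill -0 "$pid" 2>/dev/null || return 1   # target exited before it ever listened
            [[ -S $rpc_addr ]] && return 0           # socket exists, target is listening
            sleep 0.1
        done
        return 1
    }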
00:06:46.960 20:54:02 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:46.960 20:54:02 -- common/autotest_common.sh@10 -- # set +x 00:06:46.960 [2024-04-25 20:54:02.573934] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:06:46.960 [2024-04-25 20:54:02.573987] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid182465 ] 00:06:46.960 EAL: No free 2048 kB hugepages reported on node 1 00:06:46.960 [2024-04-25 20:54:02.610432] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:06:47.219 [2024-04-25 20:54:02.660467] app.c: 691:claim_cpu_cores: *ERROR*: Cannot create lock on core 0, probably process 182457 has claimed it. 00:06:47.219 [2024-04-25 20:54:02.660498] app.c: 821:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:06:47.785 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 832: kill: (182465) - No such process 00:06:47.785 ERROR: process (pid: 182465) is no longer running 00:06:47.785 20:54:03 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:47.785 20:54:03 -- common/autotest_common.sh@850 -- # return 1 00:06:47.785 20:54:03 -- common/autotest_common.sh@641 -- # es=1 00:06:47.785 20:54:03 -- common/autotest_common.sh@649 -- # (( es > 128 )) 00:06:47.785 20:54:03 -- common/autotest_common.sh@660 -- # [[ -n '' ]] 00:06:47.785 20:54:03 -- common/autotest_common.sh@665 -- # (( !es == 0 )) 00:06:47.785 20:54:03 -- event/cpu_locks.sh@122 -- # locks_exist 182457 00:06:47.785 20:54:03 -- event/cpu_locks.sh@22 -- # lslocks -p 182457 00:06:47.785 20:54:03 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:48.353 lslocks: write error 00:06:48.353 20:54:03 -- event/cpu_locks.sh@124 -- # killprocess 182457 00:06:48.353 20:54:03 -- common/autotest_common.sh@936 -- # '[' -z 182457 ']' 00:06:48.353 20:54:03 -- common/autotest_common.sh@940 -- # kill -0 182457 00:06:48.353 20:54:03 -- common/autotest_common.sh@941 -- # uname 00:06:48.353 20:54:03 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:48.353 20:54:03 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 182457 00:06:48.353 20:54:03 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:48.353 20:54:03 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:48.353 20:54:03 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 182457' 00:06:48.353 killing process with pid 182457 00:06:48.353 20:54:03 -- common/autotest_common.sh@955 -- # kill 182457 00:06:48.353 20:54:03 -- common/autotest_common.sh@960 -- # wait 182457 00:06:48.618 00:06:48.618 real 0m1.921s 00:06:48.618 user 0m2.008s 00:06:48.618 sys 0m0.723s 00:06:48.618 20:54:04 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:48.618 20:54:04 -- common/autotest_common.sh@10 -- # set +x 00:06:48.618 ************************************ 00:06:48.618 END TEST locking_app_on_locked_coremask 00:06:48.618 ************************************ 00:06:48.618 20:54:04 -- event/cpu_locks.sh@171 -- # run_test locking_overlapped_coremask locking_overlapped_coremask 00:06:48.618 20:54:04 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:48.618 20:54:04 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:48.618 
20:54:04 -- common/autotest_common.sh@10 -- # set +x 00:06:48.878 ************************************ 00:06:48.878 START TEST locking_overlapped_coremask 00:06:48.878 ************************************ 00:06:48.878 20:54:04 -- common/autotest_common.sh@1111 -- # locking_overlapped_coremask 00:06:48.878 20:54:04 -- event/cpu_locks.sh@131 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 00:06:48.878 20:54:04 -- event/cpu_locks.sh@132 -- # spdk_tgt_pid=182948 00:06:48.878 20:54:04 -- event/cpu_locks.sh@133 -- # waitforlisten 182948 /var/tmp/spdk.sock 00:06:48.878 20:54:04 -- common/autotest_common.sh@817 -- # '[' -z 182948 ']' 00:06:48.878 20:54:04 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:48.878 20:54:04 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:48.878 20:54:04 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:48.878 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:48.878 20:54:04 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:48.878 20:54:04 -- common/autotest_common.sh@10 -- # set +x 00:06:48.878 [2024-04-25 20:54:04.404697] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:06:48.878 [2024-04-25 20:54:04.404755] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid182948 ] 00:06:48.878 EAL: No free 2048 kB hugepages reported on node 1 00:06:48.878 [2024-04-25 20:54:04.440262] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 
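(Annotation) The -m 0x7 mask above selects cores 0-2, which is why three reactors come up next. A quick way to decode any such mask, plain shell arithmetic added for illustration:

    mask=0x7
    for c in {0..7}; do (( mask >> c & 1 )) && echo "core $c enabled"; done
    # prints: core 0 enabled / core 1 enabled / core 2 enabled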
00:06:48.878 [2024-04-25 20:54:04.471727] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:48.878 [2024-04-25 20:54:04.511665] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:48.878 [2024-04-25 20:54:04.515009] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:48.878 [2024-04-25 20:54:04.515012] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:49.137 20:54:04 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:49.137 20:54:04 -- common/autotest_common.sh@850 -- # return 0 00:06:49.137 20:54:04 -- event/cpu_locks.sh@136 -- # spdk_tgt_pid2=183017 00:06:49.137 20:54:04 -- event/cpu_locks.sh@137 -- # NOT waitforlisten 183017 /var/tmp/spdk2.sock 00:06:49.137 20:54:04 -- event/cpu_locks.sh@135 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock 00:06:49.137 20:54:04 -- common/autotest_common.sh@638 -- # local es=0 00:06:49.137 20:54:04 -- common/autotest_common.sh@640 -- # valid_exec_arg waitforlisten 183017 /var/tmp/spdk2.sock 00:06:49.137 20:54:04 -- common/autotest_common.sh@626 -- # local arg=waitforlisten 00:06:49.137 20:54:04 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:49.137 20:54:04 -- common/autotest_common.sh@630 -- # type -t waitforlisten 00:06:49.137 20:54:04 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:49.137 20:54:04 -- common/autotest_common.sh@641 -- # waitforlisten 183017 /var/tmp/spdk2.sock 00:06:49.137 20:54:04 -- common/autotest_common.sh@817 -- # '[' -z 183017 ']' 00:06:49.137 20:54:04 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:49.137 20:54:04 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:49.137 20:54:04 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:49.137 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:49.137 20:54:04 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:49.137 20:54:04 -- common/autotest_common.sh@10 -- # set +x 00:06:49.137 [2024-04-25 20:54:04.737977] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:06:49.137 [2024-04-25 20:54:04.738075] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid183017 ] 00:06:49.137 EAL: No free 2048 kB hugepages reported on node 1 00:06:49.137 [2024-04-25 20:54:04.779775] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:06:49.395 [2024-04-25 20:54:04.836115] app.c: 691:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 182948 has claimed it. 00:06:49.396 [2024-04-25 20:54:04.836150] app.c: 821:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 
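(Annotation) Core 2 is the contested core above because it is the only bit shared by the two masks; worked out in shell (pure arithmetic, not part of the test):

    printf 'overlap: 0x%x\n' $(( 0x7 & 0x1c ))   # -> 0x4, i.e. core 2 only

So the second target (0x1c = cores 2-4) trips over the first target's lock on core 2 and exits, exactly as logged.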
00:06:49.963 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 832: kill: (183017) - No such process 00:06:49.963 ERROR: process (pid: 183017) is no longer running 00:06:49.963 20:54:05 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:49.963 20:54:05 -- common/autotest_common.sh@850 -- # return 1 00:06:49.963 20:54:05 -- common/autotest_common.sh@641 -- # es=1 00:06:49.963 20:54:05 -- common/autotest_common.sh@649 -- # (( es > 128 )) 00:06:49.963 20:54:05 -- common/autotest_common.sh@660 -- # [[ -n '' ]] 00:06:49.963 20:54:05 -- common/autotest_common.sh@665 -- # (( !es == 0 )) 00:06:49.963 20:54:05 -- event/cpu_locks.sh@139 -- # check_remaining_locks 00:06:49.963 20:54:05 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:06:49.963 20:54:05 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:06:49.963 20:54:05 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:06:49.963 20:54:05 -- event/cpu_locks.sh@141 -- # killprocess 182948 00:06:49.963 20:54:05 -- common/autotest_common.sh@936 -- # '[' -z 182948 ']' 00:06:49.963 20:54:05 -- common/autotest_common.sh@940 -- # kill -0 182948 00:06:49.963 20:54:05 -- common/autotest_common.sh@941 -- # uname 00:06:49.963 20:54:05 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:49.963 20:54:05 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 182948 00:06:49.963 20:54:05 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:49.963 20:54:05 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:49.963 20:54:05 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 182948' 00:06:49.963 killing process with pid 182948 00:06:49.963 20:54:05 -- common/autotest_common.sh@955 -- # kill 182948 00:06:49.963 20:54:05 -- common/autotest_common.sh@960 -- # wait 182948 00:06:50.238 00:06:50.239 real 0m1.350s 00:06:50.239 user 0m3.720s 00:06:50.239 sys 0m0.408s 00:06:50.239 20:54:05 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:50.239 20:54:05 -- common/autotest_common.sh@10 -- # set +x 00:06:50.239 ************************************ 00:06:50.239 END TEST locking_overlapped_coremask 00:06:50.239 ************************************ 00:06:50.239 20:54:05 -- event/cpu_locks.sh@172 -- # run_test locking_overlapped_coremask_via_rpc locking_overlapped_coremask_via_rpc 00:06:50.239 20:54:05 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:50.239 20:54:05 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:50.239 20:54:05 -- common/autotest_common.sh@10 -- # set +x 00:06:50.239 ************************************ 00:06:50.239 START TEST locking_overlapped_coremask_via_rpc 00:06:50.239 ************************************ 00:06:50.239 20:54:05 -- common/autotest_common.sh@1111 -- # locking_overlapped_coremask_via_rpc 00:06:50.239 20:54:05 -- event/cpu_locks.sh@147 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks 00:06:50.239 20:54:05 -- event/cpu_locks.sh@148 -- # spdk_tgt_pid=183503 00:06:50.239 20:54:05 -- event/cpu_locks.sh@149 -- # waitforlisten 183503 /var/tmp/spdk.sock 00:06:50.239 20:54:05 -- common/autotest_common.sh@817 -- # '[' -z 183503 ']' 00:06:50.239 20:54:05 -- 
common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:50.239 20:54:05 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:50.239 20:54:05 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:50.239 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:50.239 20:54:05 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:50.239 20:54:05 -- common/autotest_common.sh@10 -- # set +x 00:06:50.498 [2024-04-25 20:54:05.903946] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:06:50.498 [2024-04-25 20:54:05.904004] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid183503 ] 00:06:50.498 EAL: No free 2048 kB hugepages reported on node 1 00:06:50.498 [2024-04-25 20:54:05.940435] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:06:50.498 [2024-04-25 20:54:05.971729] app.c: 825:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:50.498 [2024-04-25 20:54:05.971749] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:50.498 [2024-04-25 20:54:06.010476] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:50.498 [2024-04-25 20:54:06.010496] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:50.498 [2024-04-25 20:54:06.010500] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:50.757 20:54:06 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:50.757 20:54:06 -- common/autotest_common.sh@850 -- # return 0 00:06:50.757 20:54:06 -- event/cpu_locks.sh@152 -- # spdk_tgt_pid2=183522 00:06:50.757 20:54:06 -- event/cpu_locks.sh@153 -- # waitforlisten 183522 /var/tmp/spdk2.sock 00:06:50.757 20:54:06 -- event/cpu_locks.sh@151 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks 00:06:50.757 20:54:06 -- common/autotest_common.sh@817 -- # '[' -z 183522 ']' 00:06:50.757 20:54:06 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:50.757 20:54:06 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:50.757 20:54:06 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:50.757 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:50.757 20:54:06 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:50.757 20:54:06 -- common/autotest_common.sh@10 -- # set +x 00:06:50.757 [2024-04-25 20:54:06.212169] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:06:50.757 [2024-04-25 20:54:06.212238] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid183522 ] 00:06:50.757 EAL: No free 2048 kB hugepages reported on node 1 00:06:50.757 [2024-04-25 20:54:06.250762] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 
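(Annotation) Unlike the previous test, both targets here start with --disable-cpumask-locks, so neither claims the /var/tmp/spdk_cpu_lock_* files at startup and the overlapping masks (0x7 and 0x1c) are tolerated; locks are only claimed later via RPC. The shape of the two launches, condensed from the commands in this log:

    ./build/bin/spdk_tgt -m 0x7  --disable-cpumask-locks
    ./build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks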
00:06:50.757 [2024-04-25 20:54:06.305974] app.c: 825:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:50.757 [2024-04-25 20:54:06.306002] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:50.757 [2024-04-25 20:54:06.379817] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:50.757 [2024-04-25 20:54:06.383050] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:50.757 [2024-04-25 20:54:06.383051] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 4 00:06:51.694 20:54:07 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:51.694 20:54:07 -- common/autotest_common.sh@850 -- # return 0 00:06:51.694 20:54:07 -- event/cpu_locks.sh@155 -- # rpc_cmd framework_enable_cpumask_locks 00:06:51.694 20:54:07 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:51.694 20:54:07 -- common/autotest_common.sh@10 -- # set +x 00:06:51.694 20:54:07 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:51.694 20:54:07 -- event/cpu_locks.sh@156 -- # NOT rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:51.694 20:54:07 -- common/autotest_common.sh@638 -- # local es=0 00:06:51.694 20:54:07 -- common/autotest_common.sh@640 -- # valid_exec_arg rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:51.694 20:54:07 -- common/autotest_common.sh@626 -- # local arg=rpc_cmd 00:06:51.694 20:54:07 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:51.694 20:54:07 -- common/autotest_common.sh@630 -- # type -t rpc_cmd 00:06:51.694 20:54:07 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:51.694 20:54:07 -- common/autotest_common.sh@641 -- # rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:51.694 20:54:07 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:51.694 20:54:07 -- common/autotest_common.sh@10 -- # set +x 00:06:51.694 [2024-04-25 20:54:07.047058] app.c: 691:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 183503 has claimed it. 00:06:51.694 request: 00:06:51.694 { 00:06:51.694 "method": "framework_enable_cpumask_locks", 00:06:51.694 "req_id": 1 00:06:51.694 } 00:06:51.694 Got JSON-RPC error response 00:06:51.694 response: 00:06:51.694 { 00:06:51.694 "code": -32603, 00:06:51.694 "message": "Failed to claim CPU core: 2" 00:06:51.694 } 00:06:51.694 20:54:07 -- common/autotest_common.sh@577 -- # [[ 1 == 0 ]] 00:06:51.694 20:54:07 -- common/autotest_common.sh@641 -- # es=1 00:06:51.694 20:54:07 -- common/autotest_common.sh@649 -- # (( es > 128 )) 00:06:51.694 20:54:07 -- common/autotest_common.sh@660 -- # [[ -n '' ]] 00:06:51.694 20:54:07 -- common/autotest_common.sh@665 -- # (( !es == 0 )) 00:06:51.694 20:54:07 -- event/cpu_locks.sh@158 -- # waitforlisten 183503 /var/tmp/spdk.sock 00:06:51.694 20:54:07 -- common/autotest_common.sh@817 -- # '[' -z 183503 ']' 00:06:51.694 20:54:07 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:51.694 20:54:07 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:51.694 20:54:07 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:51.694 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
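(Annotation) The JSON-RPC exchange above can be reproduced by hand with SPDK's rpc.py (the method name framework_enable_cpumask_locks and the spdk2.sock address are taken from the log; the script path is relative to the SPDK tree). A sketch:

    scripts/rpc.py framework_enable_cpumask_locks                          # first target: succeeds
    scripts/rpc.py -s /var/tmp/spdk2.sock framework_enable_cpumask_locks
    # the second call fails with code -32603, "Failed to claim CPU core: 2",
    # because the first target already holds the lock for the shared core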
00:06:51.694 20:54:07 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:51.694 20:54:07 -- common/autotest_common.sh@10 -- # set +x 00:06:51.694 20:54:07 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:51.694 20:54:07 -- common/autotest_common.sh@850 -- # return 0 00:06:51.694 20:54:07 -- event/cpu_locks.sh@159 -- # waitforlisten 183522 /var/tmp/spdk2.sock 00:06:51.694 20:54:07 -- common/autotest_common.sh@817 -- # '[' -z 183522 ']' 00:06:51.694 20:54:07 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:51.694 20:54:07 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:51.694 20:54:07 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:51.694 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:51.694 20:54:07 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:51.694 20:54:07 -- common/autotest_common.sh@10 -- # set +x 00:06:51.954 20:54:07 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:51.954 20:54:07 -- common/autotest_common.sh@850 -- # return 0 00:06:51.954 20:54:07 -- event/cpu_locks.sh@161 -- # check_remaining_locks 00:06:51.954 20:54:07 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:06:51.954 20:54:07 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:06:51.954 20:54:07 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:06:51.954 00:06:51.954 real 0m1.539s 00:06:51.954 user 0m0.707s 00:06:51.954 sys 0m0.133s 00:06:51.954 20:54:07 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:51.954 20:54:07 -- common/autotest_common.sh@10 -- # set +x 00:06:51.954 ************************************ 00:06:51.954 END TEST locking_overlapped_coremask_via_rpc 00:06:51.954 ************************************ 00:06:51.954 20:54:07 -- event/cpu_locks.sh@174 -- # cleanup 00:06:51.954 20:54:07 -- event/cpu_locks.sh@15 -- # [[ -z 183503 ]] 00:06:51.954 20:54:07 -- event/cpu_locks.sh@15 -- # killprocess 183503 00:06:51.954 20:54:07 -- common/autotest_common.sh@936 -- # '[' -z 183503 ']' 00:06:51.954 20:54:07 -- common/autotest_common.sh@940 -- # kill -0 183503 00:06:51.954 20:54:07 -- common/autotest_common.sh@941 -- # uname 00:06:51.954 20:54:07 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:51.954 20:54:07 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 183503 00:06:51.954 20:54:07 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:51.954 20:54:07 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:51.954 20:54:07 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 183503' 00:06:51.954 killing process with pid 183503 00:06:51.954 20:54:07 -- common/autotest_common.sh@955 -- # kill 183503 00:06:51.954 20:54:07 -- common/autotest_common.sh@960 -- # wait 183503 00:06:52.213 20:54:07 -- event/cpu_locks.sh@16 -- # [[ -z 183522 ]] 00:06:52.213 20:54:07 -- event/cpu_locks.sh@16 -- # killprocess 183522 00:06:52.213 20:54:07 -- common/autotest_common.sh@936 -- # '[' -z 183522 ']' 00:06:52.213 20:54:07 -- common/autotest_common.sh@940 -- # kill -0 183522 00:06:52.213 20:54:07 -- common/autotest_common.sh@941 -- # uname 00:06:52.213 
20:54:07 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:52.213 20:54:07 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 183522 00:06:52.472 20:54:07 -- common/autotest_common.sh@942 -- # process_name=reactor_2 00:06:52.472 20:54:07 -- common/autotest_common.sh@946 -- # '[' reactor_2 = sudo ']' 00:06:52.472 20:54:07 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 183522' 00:06:52.472 killing process with pid 183522 00:06:52.472 20:54:07 -- common/autotest_common.sh@955 -- # kill 183522 00:06:52.472 20:54:07 -- common/autotest_common.sh@960 -- # wait 183522 00:06:52.731 20:54:08 -- event/cpu_locks.sh@18 -- # rm -f 00:06:52.731 20:54:08 -- event/cpu_locks.sh@1 -- # cleanup 00:06:52.731 20:54:08 -- event/cpu_locks.sh@15 -- # [[ -z 183503 ]] 00:06:52.731 20:54:08 -- event/cpu_locks.sh@15 -- # killprocess 183503 00:06:52.731 20:54:08 -- common/autotest_common.sh@936 -- # '[' -z 183503 ']' 00:06:52.731 20:54:08 -- common/autotest_common.sh@940 -- # kill -0 183503 00:06:52.731 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 940: kill: (183503) - No such process 00:06:52.731 20:54:08 -- common/autotest_common.sh@963 -- # echo 'Process with pid 183503 is not found' 00:06:52.731 Process with pid 183503 is not found 00:06:52.731 20:54:08 -- event/cpu_locks.sh@16 -- # [[ -z 183522 ]] 00:06:52.731 20:54:08 -- event/cpu_locks.sh@16 -- # killprocess 183522 00:06:52.731 20:54:08 -- common/autotest_common.sh@936 -- # '[' -z 183522 ']' 00:06:52.731 20:54:08 -- common/autotest_common.sh@940 -- # kill -0 183522 00:06:52.731 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 940: kill: (183522) - No such process 00:06:52.731 20:54:08 -- common/autotest_common.sh@963 -- # echo 'Process with pid 183522 is not found' 00:06:52.731 Process with pid 183522 is not found 00:06:52.731 20:54:08 -- event/cpu_locks.sh@18 -- # rm -f 00:06:52.731 00:06:52.731 real 0m15.757s 00:06:52.731 user 0m24.705s 00:06:52.731 sys 0m6.214s 00:06:52.731 20:54:08 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:52.731 20:54:08 -- common/autotest_common.sh@10 -- # set +x 00:06:52.731 ************************************ 00:06:52.731 END TEST cpu_locks 00:06:52.731 ************************************ 00:06:52.731 00:06:52.731 real 0m40.399s 00:06:52.731 user 1m11.550s 00:06:52.731 sys 0m10.792s 00:06:52.731 20:54:08 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:52.731 20:54:08 -- common/autotest_common.sh@10 -- # set +x 00:06:52.731 ************************************ 00:06:52.731 END TEST event 00:06:52.731 ************************************ 00:06:52.731 20:54:08 -- spdk/autotest.sh@178 -- # run_test thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:06:52.731 20:54:08 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:52.731 20:54:08 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:52.731 20:54:08 -- common/autotest_common.sh@10 -- # set +x 00:06:52.989 ************************************ 00:06:52.989 START TEST thread 00:06:52.989 ************************************ 00:06:52.989 20:54:08 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:06:52.989 * Looking for test storage... 
00:06:52.989 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread 00:06:52.989 20:54:08 -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:06:52.989 20:54:08 -- common/autotest_common.sh@1087 -- # '[' 8 -le 1 ']' 00:06:52.989 20:54:08 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:52.989 20:54:08 -- common/autotest_common.sh@10 -- # set +x 00:06:53.247 ************************************ 00:06:53.247 START TEST thread_poller_perf 00:06:53.247 ************************************ 00:06:53.247 20:54:08 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:06:53.247 [2024-04-25 20:54:08.713635] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:06:53.247 [2024-04-25 20:54:08.713719] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid184157 ] 00:06:53.247 EAL: No free 2048 kB hugepages reported on node 1 00:06:53.247 [2024-04-25 20:54:08.753509] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:06:53.247 [2024-04-25 20:54:08.784340] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:53.247 [2024-04-25 20:54:08.822309] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:53.247 Running 1000 pollers for 1 seconds with 1 microseconds period. 00:06:54.620 ====================================== 00:06:54.620 busy:2505312372 (cyc) 00:06:54.620 total_run_count: 865000 00:06:54.620 tsc_hz: 2500000000 (cyc) 00:06:54.620 ====================================== 00:06:54.620 poller_cost: 2896 (cyc), 1158 (nsec) 00:06:54.620 00:06:54.620 real 0m1.181s 00:06:54.620 user 0m1.083s 00:06:54.620 sys 0m0.094s 00:06:54.620 20:54:09 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:54.620 20:54:09 -- common/autotest_common.sh@10 -- # set +x 00:06:54.620 ************************************ 00:06:54.620 END TEST thread_poller_perf 00:06:54.620 ************************************ 00:06:54.620 20:54:09 -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:06:54.620 20:54:09 -- common/autotest_common.sh@1087 -- # '[' 8 -le 1 ']' 00:06:54.620 20:54:09 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:54.620 20:54:09 -- common/autotest_common.sh@10 -- # set +x 00:06:54.620 ************************************ 00:06:54.620 START TEST thread_poller_perf 00:06:54.620 ************************************ 00:06:54.620 20:54:10 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:06:54.620 [2024-04-25 20:54:10.081013] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 
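(Annotation) Sanity-checking the first poller_perf summary above: poller_cost is simply busy cycles divided by run count, and the nanosecond figure follows from the 2.5 GHz TSC. Worked out in shell:

    echo $(( 2505312372 / 865000 ))      # -> 2896 cycles per poller invocation
    awk 'BEGIN { print 2896 / 2.5 }'     # -> 1158.4 ns at 2.5 GHz (logged as 1158 nsec)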
00:06:54.620 [2024-04-25 20:54:10.081100] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid184443 ] 00:06:54.620 EAL: No free 2048 kB hugepages reported on node 1 00:06:54.620 [2024-04-25 20:54:10.121551] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:06:54.620 [2024-04-25 20:54:10.153331] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:54.620 [2024-04-25 20:54:10.190147] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:54.620 Running 1000 pollers for 1 seconds with 0 microseconds period. 00:06:55.994 ====================================== 00:06:55.994 busy:2501332738 (cyc) 00:06:55.994 total_run_count: 13270000 00:06:55.994 tsc_hz: 2500000000 (cyc) 00:06:55.994 ====================================== 00:06:55.994 poller_cost: 188 (cyc), 75 (nsec) 00:06:55.994 00:06:55.994 real 0m1.181s 00:06:55.994 user 0m1.099s 00:06:55.994 sys 0m0.079s 00:06:55.994 20:54:11 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:55.994 20:54:11 -- common/autotest_common.sh@10 -- # set +x 00:06:55.994 ************************************ 00:06:55.994 END TEST thread_poller_perf 00:06:55.994 ************************************ 00:06:55.994 20:54:11 -- thread/thread.sh@17 -- # [[ n != \y ]] 00:06:55.994 20:54:11 -- thread/thread.sh@18 -- # run_test thread_spdk_lock /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:06:55.994 20:54:11 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:55.994 20:54:11 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:55.994 20:54:11 -- common/autotest_common.sh@10 -- # set +x 00:06:55.994 ************************************ 00:06:55.994 START TEST thread_spdk_lock 00:06:55.994 ************************************ 00:06:55.994 20:54:11 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:06:55.994 [2024-04-25 20:54:11.467850] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:06:55.994 [2024-04-25 20:54:11.467949] [ DPDK EAL parameters: spdk_lock_test --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid184716 ] 00:06:55.994 EAL: No free 2048 kB hugepages reported on node 1 00:06:55.994 [2024-04-25 20:54:11.508987] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 
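(Annotation) The same arithmetic for the zero-period run above gives a far lower per-callback cost, consistent with -l 0 registering plain active pollers rather than 1 us timed pollers, so the timer bookkeeping overhead disappears:

    awk 'BEGIN { printf "%d cyc\n", 2501332738 / 13270000 }'   # -> 188 cyc (~75 ns)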
00:06:55.994 [2024-04-25 20:54:11.541804] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:55.994 [2024-04-25 20:54:11.579493] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:55.994 [2024-04-25 20:54:11.579495] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:56.559 [2024-04-25 20:54:12.065211] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 955:thread_execute_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:56.559 [2024-04-25 20:54:12.065250] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3062:spdk_spin_lock: *ERROR*: unrecoverable spinlock error 2: Deadlock detected (thread != sspin->thread) 00:06:56.559 [2024-04-25 20:54:12.065261] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3017:sspin_stacks_print: *ERROR*: spinlock 0x135e600 00:06:56.559 [2024-04-25 20:54:12.066155] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 850:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:56.559 [2024-04-25 20:54:12.066259] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1016:thread_execute_timed_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:56.559 [2024-04-25 20:54:12.066281] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 850:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:56.559 Starting test contend 00:06:56.559 Worker Delay Wait us Hold us Total us 00:06:56.559 0 3 166371 183911 350282 00:06:56.559 1 5 83536 284120 367657 00:06:56.559 PASS test contend 00:06:56.559 Starting test hold_by_poller 00:06:56.559 PASS test hold_by_poller 00:06:56.559 Starting test hold_by_message 00:06:56.559 PASS test hold_by_message 00:06:56.559 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock summary: 00:06:56.559 100014 assertions passed 00:06:56.559 0 assertions failed 00:06:56.559 00:06:56.559 real 0m0.667s 00:06:56.559 user 0m1.060s 00:06:56.559 sys 0m0.091s 00:06:56.559 20:54:12 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:56.559 20:54:12 -- common/autotest_common.sh@10 -- # set +x 00:06:56.559 ************************************ 00:06:56.559 END TEST thread_spdk_lock 00:06:56.559 ************************************ 00:06:56.559 00:06:56.559 real 0m3.729s 00:06:56.559 user 0m3.476s 00:06:56.559 sys 0m0.686s 00:06:56.559 20:54:12 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:56.559 20:54:12 -- common/autotest_common.sh@10 -- # set +x 00:06:56.559 ************************************ 00:06:56.559 END TEST thread 00:06:56.559 ************************************ 00:06:56.559 20:54:12 -- spdk/autotest.sh@179 -- # run_test accel /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:06:56.559 20:54:12 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:56.559 20:54:12 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:56.559 20:54:12 -- common/autotest_common.sh@10 -- # set +x 00:06:56.817 ************************************ 00:06:56.817 START TEST accel 00:06:56.817 ************************************ 00:06:56.817 20:54:12 -- common/autotest_common.sh@1111 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:06:56.817 * Looking for test storage... 00:06:56.817 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:06:56.817 20:54:12 -- accel/accel.sh@81 -- # declare -A expected_opcs 00:06:56.817 20:54:12 -- accel/accel.sh@82 -- # get_expected_opcs 00:06:56.817 20:54:12 -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:56.817 20:54:12 -- accel/accel.sh@62 -- # spdk_tgt_pid=184821 00:06:56.817 20:54:12 -- accel/accel.sh@63 -- # waitforlisten 184821 00:06:56.817 20:54:12 -- common/autotest_common.sh@817 -- # '[' -z 184821 ']' 00:06:56.817 20:54:12 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:56.817 20:54:12 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:56.817 20:54:12 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:56.817 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:56.817 20:54:12 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:56.817 20:54:12 -- common/autotest_common.sh@10 -- # set +x 00:06:56.817 20:54:12 -- accel/accel.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:06:56.817 20:54:12 -- accel/accel.sh@61 -- # build_accel_config 00:06:56.817 20:54:12 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:56.817 20:54:12 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:56.817 20:54:12 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:56.817 20:54:12 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:57.075 20:54:12 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:57.075 20:54:12 -- accel/accel.sh@40 -- # local IFS=, 00:06:57.075 20:54:12 -- accel/accel.sh@41 -- # jq -r . 00:06:57.075 [2024-04-25 20:54:12.491094] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:06:57.075 [2024-04-25 20:54:12.491159] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid184821 ] 00:06:57.075 EAL: No free 2048 kB hugepages reported on node 1 00:06:57.075 [2024-04-25 20:54:12.526597] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:06:57.075 [2024-04-25 20:54:12.557235] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:57.075 [2024-04-25 20:54:12.595874] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:57.334 20:54:12 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:57.334 20:54:12 -- common/autotest_common.sh@850 -- # return 0 00:06:57.334 20:54:12 -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:06:57.334 20:54:12 -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:06:57.334 20:54:12 -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:06:57.334 20:54:12 -- accel/accel.sh@68 -- # [[ -n '' ]] 00:06:57.334 20:54:12 -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:06:57.334 20:54:12 -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:06:57.334 20:54:12 -- accel/accel.sh@70 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:06:57.334 20:54:12 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:57.334 20:54:12 -- common/autotest_common.sh@10 -- # set +x 00:06:57.334 20:54:12 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:57.334 20:54:12 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:57.334 20:54:12 -- accel/accel.sh@72 -- # IFS== 00:06:57.334 20:54:12 -- accel/accel.sh@72 -- # read -r opc module 00:06:57.334 20:54:12 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:57.334 20:54:12 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:57.334 20:54:12 -- accel/accel.sh@72 -- # IFS== 00:06:57.334 20:54:12 -- accel/accel.sh@72 -- # read -r opc module 00:06:57.334 20:54:12 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:57.334 20:54:12 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:57.334 20:54:12 -- accel/accel.sh@72 -- # IFS== 00:06:57.334 20:54:12 -- accel/accel.sh@72 -- # read -r opc module 00:06:57.334 20:54:12 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:57.334 20:54:12 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:57.334 20:54:12 -- accel/accel.sh@72 -- # IFS== 00:06:57.334 20:54:12 -- accel/accel.sh@72 -- # read -r opc module 00:06:57.334 20:54:12 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:57.334 20:54:12 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:57.334 20:54:12 -- accel/accel.sh@72 -- # IFS== 00:06:57.334 20:54:12 -- accel/accel.sh@72 -- # read -r opc module 00:06:57.334 20:54:12 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:57.334 20:54:12 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:57.334 20:54:12 -- accel/accel.sh@72 -- # IFS== 00:06:57.334 20:54:12 -- accel/accel.sh@72 -- # read -r opc module 00:06:57.334 20:54:12 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:57.334 20:54:12 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:57.334 20:54:12 -- accel/accel.sh@72 -- # IFS== 00:06:57.334 20:54:12 -- accel/accel.sh@72 -- # read -r opc module 00:06:57.334 20:54:12 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:57.334 20:54:12 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:57.334 20:54:12 -- accel/accel.sh@72 -- # IFS== 00:06:57.334 20:54:12 -- accel/accel.sh@72 -- # read -r opc module 00:06:57.334 20:54:12 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:57.334 20:54:12 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:57.334 20:54:12 -- accel/accel.sh@72 -- # IFS== 00:06:57.334 20:54:12 -- accel/accel.sh@72 -- # read -r opc module 00:06:57.334 20:54:12 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:57.334 20:54:12 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:57.334 20:54:12 -- accel/accel.sh@72 -- # IFS== 00:06:57.334 20:54:12 -- accel/accel.sh@72 -- # read -r opc module 00:06:57.334 20:54:12 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:57.334 20:54:12 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:57.334 20:54:12 -- accel/accel.sh@72 -- # IFS== 00:06:57.334 20:54:12 -- accel/accel.sh@72 -- # read -r opc module 00:06:57.334 20:54:12 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:57.334 20:54:12 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:57.334 20:54:12 -- accel/accel.sh@72 -- # IFS== 00:06:57.334 20:54:12 -- accel/accel.sh@72 -- # read -r opc module 00:06:57.334 
20:54:12 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:57.334 20:54:12 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:57.334 20:54:12 -- accel/accel.sh@72 -- # IFS== 00:06:57.334 20:54:12 -- accel/accel.sh@72 -- # read -r opc module 00:06:57.334 20:54:12 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:57.334 20:54:12 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:57.334 20:54:12 -- accel/accel.sh@72 -- # IFS== 00:06:57.334 20:54:12 -- accel/accel.sh@72 -- # read -r opc module 00:06:57.334 20:54:12 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:57.334 20:54:12 -- accel/accel.sh@75 -- # killprocess 184821 00:06:57.334 20:54:12 -- common/autotest_common.sh@936 -- # '[' -z 184821 ']' 00:06:57.334 20:54:12 -- common/autotest_common.sh@940 -- # kill -0 184821 00:06:57.334 20:54:12 -- common/autotest_common.sh@941 -- # uname 00:06:57.334 20:54:12 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:57.334 20:54:12 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 184821 00:06:57.334 20:54:12 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:57.334 20:54:12 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:57.334 20:54:12 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 184821' 00:06:57.334 killing process with pid 184821 00:06:57.334 20:54:12 -- common/autotest_common.sh@955 -- # kill 184821 00:06:57.334 20:54:12 -- common/autotest_common.sh@960 -- # wait 184821 00:06:57.592 20:54:13 -- accel/accel.sh@76 -- # trap - ERR 00:06:57.592 20:54:13 -- accel/accel.sh@89 -- # run_test accel_help accel_perf -h 00:06:57.592 20:54:13 -- common/autotest_common.sh@1087 -- # '[' 3 -le 1 ']' 00:06:57.592 20:54:13 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:57.592 20:54:13 -- common/autotest_common.sh@10 -- # set +x 00:06:57.851 20:54:13 -- common/autotest_common.sh@1111 -- # accel_perf -h 00:06:57.851 20:54:13 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:06:57.851 20:54:13 -- accel/accel.sh@12 -- # build_accel_config 00:06:57.851 20:54:13 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:57.851 20:54:13 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:57.851 20:54:13 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:57.851 20:54:13 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:57.851 20:54:13 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:57.851 20:54:13 -- accel/accel.sh@40 -- # local IFS=, 00:06:57.851 20:54:13 -- accel/accel.sh@41 -- # jq -r . 
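(Annotation) The expected_opcs table filled in above comes from a query like the following, a sketch reusing the same rpc.py call and jq expression visible in this log; with no hardware accel module loaded, every opcode maps to the software module:

    scripts/rpc.py accel_get_opc_assignments \
        | jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]'
    # e.g. copy=software, fill=software, crc32c=software, ...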
00:06:57.851 20:54:13 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:57.851 20:54:13 -- common/autotest_common.sh@10 -- # set +x 00:06:57.851 20:54:13 -- accel/accel.sh@91 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:06:57.851 20:54:13 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:57.851 20:54:13 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:57.851 20:54:13 -- common/autotest_common.sh@10 -- # set +x 00:06:58.111 ************************************ 00:06:58.111 START TEST accel_missing_filename 00:06:58.111 ************************************ 00:06:58.111 20:54:13 -- common/autotest_common.sh@1111 -- # NOT accel_perf -t 1 -w compress 00:06:58.111 20:54:13 -- common/autotest_common.sh@638 -- # local es=0 00:06:58.111 20:54:13 -- common/autotest_common.sh@640 -- # valid_exec_arg accel_perf -t 1 -w compress 00:06:58.111 20:54:13 -- common/autotest_common.sh@626 -- # local arg=accel_perf 00:06:58.111 20:54:13 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:58.111 20:54:13 -- common/autotest_common.sh@630 -- # type -t accel_perf 00:06:58.111 20:54:13 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:58.111 20:54:13 -- common/autotest_common.sh@641 -- # accel_perf -t 1 -w compress 00:06:58.111 20:54:13 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:06:58.111 20:54:13 -- accel/accel.sh@12 -- # build_accel_config 00:06:58.111 20:54:13 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:58.111 20:54:13 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:58.111 20:54:13 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:58.111 20:54:13 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:58.111 20:54:13 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:58.111 20:54:13 -- accel/accel.sh@40 -- # local IFS=, 00:06:58.111 20:54:13 -- accel/accel.sh@41 -- # jq -r . 00:06:58.111 [2024-04-25 20:54:13.576425] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:06:58.111 [2024-04-25 20:54:13.576543] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid185134 ] 00:06:58.111 EAL: No free 2048 kB hugepages reported on node 1 00:06:58.111 [2024-04-25 20:54:13.618688] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:06:58.111 [2024-04-25 20:54:13.650406] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:58.111 [2024-04-25 20:54:13.691831] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:58.111 [2024-04-25 20:54:13.733091] app.c: 966:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:58.368 [2024-04-25 20:54:13.793110] accel_perf.c:1394:main: *ERROR*: ERROR starting application 00:06:58.368 A filename is required. 
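(Annotation) The negative test above confirms that a compress workload without an input file is rejected ("A filename is required."). A valid invocation supplies the file with -l and, as the next test then demonstrates, must leave -y off because compress does not support verification. Sketch, path relative to the SPDK tree:

    ./build/examples/accel_perf -t 1 -w compress -l test/accel/bib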
00:06:58.368 20:54:13 -- common/autotest_common.sh@641 -- # es=234 00:06:58.368 20:54:13 -- common/autotest_common.sh@649 -- # (( es > 128 )) 00:06:58.368 20:54:13 -- common/autotest_common.sh@650 -- # es=106 00:06:58.368 20:54:13 -- common/autotest_common.sh@651 -- # case "$es" in 00:06:58.368 20:54:13 -- common/autotest_common.sh@658 -- # es=1 00:06:58.368 20:54:13 -- common/autotest_common.sh@665 -- # (( !es == 0 )) 00:06:58.368 00:06:58.368 real 0m0.299s 00:06:58.368 user 0m0.198s 00:06:58.368 sys 0m0.139s 00:06:58.368 20:54:13 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:58.368 20:54:13 -- common/autotest_common.sh@10 -- # set +x 00:06:58.368 ************************************ 00:06:58.368 END TEST accel_missing_filename 00:06:58.368 ************************************ 00:06:58.368 20:54:13 -- accel/accel.sh@93 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:58.368 20:54:13 -- common/autotest_common.sh@1087 -- # '[' 10 -le 1 ']' 00:06:58.368 20:54:13 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:58.368 20:54:13 -- common/autotest_common.sh@10 -- # set +x 00:06:58.626 ************************************ 00:06:58.626 START TEST accel_compress_verify 00:06:58.626 ************************************ 00:06:58.626 20:54:14 -- common/autotest_common.sh@1111 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:58.626 20:54:14 -- common/autotest_common.sh@638 -- # local es=0 00:06:58.626 20:54:14 -- common/autotest_common.sh@640 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:58.626 20:54:14 -- common/autotest_common.sh@626 -- # local arg=accel_perf 00:06:58.626 20:54:14 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:58.626 20:54:14 -- common/autotest_common.sh@630 -- # type -t accel_perf 00:06:58.626 20:54:14 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:58.626 20:54:14 -- common/autotest_common.sh@641 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:58.626 20:54:14 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:58.626 20:54:14 -- accel/accel.sh@12 -- # build_accel_config 00:06:58.626 20:54:14 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:58.626 20:54:14 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:58.626 20:54:14 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:58.626 20:54:14 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:58.626 20:54:14 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:58.626 20:54:14 -- accel/accel.sh@40 -- # local IFS=, 00:06:58.626 20:54:14 -- accel/accel.sh@41 -- # jq -r . 00:06:58.626 [2024-04-25 20:54:14.073427] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 
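(Annotation) The es=234 -> es=106 -> es=1 sequence above is the NOT wrapper normalizing the child's exit status: codes above 128 get 128 subtracted (signal deaths), and the real helper then maps the remainder through a case statement down to 1. A simplified sketch of the idea, assuming the autotest_common.sh behavior seen in this log:

    NOT_sketch() {
        local es=0
        "$@" || es=$?
        (( es > 128 )) && es=$(( es - 128 ))   # killed by signal N -> N
        (( es != 0 ))                          # succeed only if the command failed
    }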
00:06:58.626 [2024-04-25 20:54:14.073504] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid185173 ] 00:06:58.626 EAL: No free 2048 kB hugepages reported on node 1 00:06:58.626 [2024-04-25 20:54:14.114036] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:06:58.626 [2024-04-25 20:54:14.145307] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:58.626 [2024-04-25 20:54:14.181388] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:58.626 [2024-04-25 20:54:14.221379] app.c: 966:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:58.626 [2024-04-25 20:54:14.281799] accel_perf.c:1394:main: *ERROR*: ERROR starting application 00:06:58.884 00:06:58.884 Compression does not support the verify option, aborting. 00:06:58.884 20:54:14 -- common/autotest_common.sh@641 -- # es=161 00:06:58.884 20:54:14 -- common/autotest_common.sh@649 -- # (( es > 128 )) 00:06:58.884 20:54:14 -- common/autotest_common.sh@650 -- # es=33 00:06:58.884 20:54:14 -- common/autotest_common.sh@651 -- # case "$es" in 00:06:58.884 20:54:14 -- common/autotest_common.sh@658 -- # es=1 00:06:58.884 20:54:14 -- common/autotest_common.sh@665 -- # (( !es == 0 )) 00:06:58.884 00:06:58.884 real 0m0.291s 00:06:58.884 user 0m0.191s 00:06:58.884 sys 0m0.141s 00:06:58.884 20:54:14 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:58.884 20:54:14 -- common/autotest_common.sh@10 -- # set +x 00:06:58.884 ************************************ 00:06:58.884 END TEST accel_compress_verify 00:06:58.884 ************************************ 00:06:58.884 20:54:14 -- accel/accel.sh@95 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:06:58.884 20:54:14 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:58.884 20:54:14 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:58.884 20:54:14 -- common/autotest_common.sh@10 -- # set +x 00:06:59.143 ************************************ 00:06:59.143 START TEST accel_wrong_workload 00:06:59.143 ************************************ 00:06:59.143 20:54:14 -- common/autotest_common.sh@1111 -- # NOT accel_perf -t 1 -w foobar 00:06:59.143 20:54:14 -- common/autotest_common.sh@638 -- # local es=0 00:06:59.143 20:54:14 -- common/autotest_common.sh@640 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:06:59.143 20:54:14 -- common/autotest_common.sh@626 -- # local arg=accel_perf 00:06:59.143 20:54:14 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:59.143 20:54:14 -- common/autotest_common.sh@630 -- # type -t accel_perf 00:06:59.143 20:54:14 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:59.143 20:54:14 -- common/autotest_common.sh@641 -- # accel_perf -t 1 -w foobar 00:06:59.143 20:54:14 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:06:59.143 20:54:14 -- accel/accel.sh@12 -- # build_accel_config 00:06:59.143 20:54:14 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:59.143 20:54:14 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:59.143 20:54:14 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:59.143 20:54:14 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:59.143 20:54:14 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:59.143 20:54:14 -- 
accel/accel.sh@40 -- # local IFS=, 00:06:59.143 20:54:14 -- accel/accel.sh@41 -- # jq -r . 00:06:59.143 Unsupported workload type: foobar 00:06:59.143 [2024-04-25 20:54:14.572325] app.c:1364:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:06:59.143 accel_perf options: 00:06:59.143 [-h help message] 00:06:59.143 [-q queue depth per core] 00:06:59.143 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:06:59.143 [-T number of threads per core 00:06:59.143 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:06:59.143 [-t time in seconds] 00:06:59.143 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:06:59.143 [ dif_verify, , dif_generate, dif_generate_copy 00:06:59.143 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:06:59.143 [-l for compress/decompress workloads, name of uncompressed input file 00:06:59.143 [-S for crc32c workload, use this seed value (default 0) 00:06:59.143 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:06:59.143 [-f for fill workload, use this BYTE value (default 255) 00:06:59.143 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:06:59.143 [-y verify result if this switch is on] 00:06:59.143 [-a tasks to allocate per core (default: same value as -q)] 00:06:59.143 Can be used to spread operations across a wider range of memory. 00:06:59.143 20:54:14 -- common/autotest_common.sh@641 -- # es=1 00:06:59.143 20:54:14 -- common/autotest_common.sh@649 -- # (( es > 128 )) 00:06:59.143 20:54:14 -- common/autotest_common.sh@660 -- # [[ -n '' ]] 00:06:59.143 20:54:14 -- common/autotest_common.sh@665 -- # (( !es == 0 )) 00:06:59.143 00:06:59.143 real 0m0.026s 00:06:59.143 user 0m0.008s 00:06:59.143 sys 0m0.018s 00:06:59.143 20:54:14 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:59.143 20:54:14 -- common/autotest_common.sh@10 -- # set +x 00:06:59.143 ************************************ 00:06:59.143 END TEST accel_wrong_workload 00:06:59.143 ************************************ 00:06:59.143 Error: writing output failed: Broken pipe 00:06:59.143 20:54:14 -- accel/accel.sh@97 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:06:59.143 20:54:14 -- common/autotest_common.sh@1087 -- # '[' 10 -le 1 ']' 00:06:59.143 20:54:14 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:59.143 20:54:14 -- common/autotest_common.sh@10 -- # set +x 00:06:59.143 ************************************ 00:06:59.143 START TEST accel_negative_buffers 00:06:59.143 ************************************ 00:06:59.143 20:54:14 -- common/autotest_common.sh@1111 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:06:59.143 20:54:14 -- common/autotest_common.sh@638 -- # local es=0 00:06:59.143 20:54:14 -- common/autotest_common.sh@640 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:06:59.143 20:54:14 -- common/autotest_common.sh@626 -- # local arg=accel_perf 00:06:59.143 20:54:14 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:59.143 20:54:14 -- common/autotest_common.sh@630 -- # type -t accel_perf 00:06:59.143 20:54:14 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:59.143 20:54:14 -- common/autotest_common.sh@641 -- # accel_perf -t 1 -w xor -y -x -1 00:06:59.143 20:54:14 -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x -1 00:06:59.143 20:54:14 -- accel/accel.sh@12 -- # build_accel_config 00:06:59.143 20:54:14 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:59.143 20:54:14 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:59.143 20:54:14 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:59.143 20:54:14 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:59.143 20:54:14 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:59.143 20:54:14 -- accel/accel.sh@40 -- # local IFS=, 00:06:59.143 20:54:14 -- accel/accel.sh@41 -- # jq -r . 00:06:59.143 -x option must be non-negative. 00:06:59.143 [2024-04-25 20:54:14.804892] app.c:1364:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:06:59.402 accel_perf options: 00:06:59.402 [-h help message] 00:06:59.402 [-q queue depth per core] 00:06:59.402 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:06:59.402 [-T number of threads per core 00:06:59.402 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:06:59.402 [-t time in seconds] 00:06:59.402 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:06:59.402 [ dif_verify, , dif_generate, dif_generate_copy 00:06:59.402 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:06:59.402 [-l for compress/decompress workloads, name of uncompressed input file 00:06:59.402 [-S for crc32c workload, use this seed value (default 0) 00:06:59.402 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:06:59.402 [-f for fill workload, use this BYTE value (default 255) 00:06:59.402 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:06:59.402 [-y verify result if this switch is on] 00:06:59.402 [-a tasks to allocate per core (default: same value as -q)] 00:06:59.402 Can be used to spread operations across a wider range of memory. 
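Per the help text just printed, -x sets the number of xor source buffers and documents a minimum of 2, so the -x -1 this test passes is rejected during argument parsing, before any I/O is submitted. For contrast, a well-formed counterpart (illustrative, not taken from this log):

    # rejected above: negative source-buffer count
    ./build/examples/accel_perf -t 1 -w xor -y -x -1
    # accepted: the documented minimum of two source buffers
    ./build/examples/accel_perf -t 1 -w xor -y -x 2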
00:06:59.402 20:54:14 -- common/autotest_common.sh@641 -- # es=1 00:06:59.402 20:54:14 -- common/autotest_common.sh@649 -- # (( es > 128 )) 00:06:59.402 20:54:14 -- common/autotest_common.sh@660 -- # [[ -n '' ]] 00:06:59.402 20:54:14 -- common/autotest_common.sh@665 -- # (( !es == 0 )) 00:06:59.402 00:06:59.402 real 0m0.025s 00:06:59.402 user 0m0.013s 00:06:59.402 sys 0m0.012s 00:06:59.402 20:54:14 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:59.402 20:54:14 -- common/autotest_common.sh@10 -- # set +x 00:06:59.402 ************************************ 00:06:59.402 END TEST accel_negative_buffers 00:06:59.402 ************************************ 00:06:59.402 Error: writing output failed: Broken pipe 00:06:59.402 20:54:14 -- accel/accel.sh@101 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:06:59.402 20:54:14 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:06:59.402 20:54:14 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:59.402 20:54:14 -- common/autotest_common.sh@10 -- # set +x 00:06:59.402 ************************************ 00:06:59.402 START TEST accel_crc32c 00:06:59.402 ************************************ 00:06:59.402 20:54:15 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w crc32c -S 32 -y 00:06:59.402 20:54:15 -- accel/accel.sh@16 -- # local accel_opc 00:06:59.402 20:54:15 -- accel/accel.sh@17 -- # local accel_module 00:06:59.402 20:54:15 -- accel/accel.sh@19 -- # IFS=: 00:06:59.402 20:54:15 -- accel/accel.sh@19 -- # read -r var val 00:06:59.402 20:54:15 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:06:59.402 20:54:15 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:06:59.402 20:54:15 -- accel/accel.sh@12 -- # build_accel_config 00:06:59.402 20:54:15 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:59.402 20:54:15 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:59.402 20:54:15 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:59.402 20:54:15 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:59.402 20:54:15 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:59.402 20:54:15 -- accel/accel.sh@40 -- # local IFS=, 00:06:59.402 20:54:15 -- accel/accel.sh@41 -- # jq -r . 00:06:59.402 [2024-04-25 20:54:15.035960] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:06:59.402 [2024-04-25 20:54:15.036040] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid185503 ] 00:06:59.662 EAL: No free 2048 kB hugepages reported on node 1 00:06:59.662 [2024-04-25 20:54:15.076146] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 
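Every invocation in this suite passes -c /dev/fd/62 while build_accel_config assembles accel_json_cfg and pipes it through jq -r .; a plausible reading is that the JSON accel configuration reaches accel_perf over an inherited file descriptor. The wiring below is a hypothetical reconstruction (the fd number comes from the log, the printf framing does not):

    accel_json_cfg=()    # stays empty here: no dsa/iaa/other module was requested
    IFS=,
    ./build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y \
        62< <(printf '{%s}' "${accel_json_cfg[*]}" | jq -r .)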
00:06:59.662 [2024-04-25 20:54:15.105582] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:59.662 [2024-04-25 20:54:15.142016] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:59.662 20:54:15 -- accel/accel.sh@20 -- # val= 00:06:59.662 20:54:15 -- accel/accel.sh@21 -- # case "$var" in 00:06:59.662 20:54:15 -- accel/accel.sh@19 -- # IFS=: 00:06:59.662 20:54:15 -- accel/accel.sh@19 -- # read -r var val 00:06:59.662 20:54:15 -- accel/accel.sh@20 -- # val= 00:06:59.662 20:54:15 -- accel/accel.sh@21 -- # case "$var" in 00:06:59.662 20:54:15 -- accel/accel.sh@19 -- # IFS=: 00:06:59.662 20:54:15 -- accel/accel.sh@19 -- # read -r var val 00:06:59.662 20:54:15 -- accel/accel.sh@20 -- # val=0x1 00:06:59.662 20:54:15 -- accel/accel.sh@21 -- # case "$var" in 00:06:59.662 20:54:15 -- accel/accel.sh@19 -- # IFS=: 00:06:59.662 20:54:15 -- accel/accel.sh@19 -- # read -r var val 00:06:59.662 20:54:15 -- accel/accel.sh@20 -- # val= 00:06:59.662 20:54:15 -- accel/accel.sh@21 -- # case "$var" in 00:06:59.662 20:54:15 -- accel/accel.sh@19 -- # IFS=: 00:06:59.662 20:54:15 -- accel/accel.sh@19 -- # read -r var val 00:06:59.662 20:54:15 -- accel/accel.sh@20 -- # val= 00:06:59.662 20:54:15 -- accel/accel.sh@21 -- # case "$var" in 00:06:59.662 20:54:15 -- accel/accel.sh@19 -- # IFS=: 00:06:59.662 20:54:15 -- accel/accel.sh@19 -- # read -r var val 00:06:59.662 20:54:15 -- accel/accel.sh@20 -- # val=crc32c 00:06:59.662 20:54:15 -- accel/accel.sh@21 -- # case "$var" in 00:06:59.662 20:54:15 -- accel/accel.sh@23 -- # accel_opc=crc32c 00:06:59.662 20:54:15 -- accel/accel.sh@19 -- # IFS=: 00:06:59.662 20:54:15 -- accel/accel.sh@19 -- # read -r var val 00:06:59.662 20:54:15 -- accel/accel.sh@20 -- # val=32 00:06:59.662 20:54:15 -- accel/accel.sh@21 -- # case "$var" in 00:06:59.662 20:54:15 -- accel/accel.sh@19 -- # IFS=: 00:06:59.662 20:54:15 -- accel/accel.sh@19 -- # read -r var val 00:06:59.662 20:54:15 -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:59.662 20:54:15 -- accel/accel.sh@21 -- # case "$var" in 00:06:59.662 20:54:15 -- accel/accel.sh@19 -- # IFS=: 00:06:59.662 20:54:15 -- accel/accel.sh@19 -- # read -r var val 00:06:59.662 20:54:15 -- accel/accel.sh@20 -- # val= 00:06:59.662 20:54:15 -- accel/accel.sh@21 -- # case "$var" in 00:06:59.662 20:54:15 -- accel/accel.sh@19 -- # IFS=: 00:06:59.662 20:54:15 -- accel/accel.sh@19 -- # read -r var val 00:06:59.662 20:54:15 -- accel/accel.sh@20 -- # val=software 00:06:59.662 20:54:15 -- accel/accel.sh@21 -- # case "$var" in 00:06:59.662 20:54:15 -- accel/accel.sh@22 -- # accel_module=software 00:06:59.662 20:54:15 -- accel/accel.sh@19 -- # IFS=: 00:06:59.662 20:54:15 -- accel/accel.sh@19 -- # read -r var val 00:06:59.662 20:54:15 -- accel/accel.sh@20 -- # val=32 00:06:59.662 20:54:15 -- accel/accel.sh@21 -- # case "$var" in 00:06:59.662 20:54:15 -- accel/accel.sh@19 -- # IFS=: 00:06:59.662 20:54:15 -- accel/accel.sh@19 -- # read -r var val 00:06:59.662 20:54:15 -- accel/accel.sh@20 -- # val=32 00:06:59.662 20:54:15 -- accel/accel.sh@21 -- # case "$var" in 00:06:59.662 20:54:15 -- accel/accel.sh@19 -- # IFS=: 00:06:59.662 20:54:15 -- accel/accel.sh@19 -- # read -r var val 00:06:59.662 20:54:15 -- accel/accel.sh@20 -- # val=1 00:06:59.662 20:54:15 -- accel/accel.sh@21 -- # case "$var" in 00:06:59.662 20:54:15 -- accel/accel.sh@19 -- # IFS=: 00:06:59.662 20:54:15 -- accel/accel.sh@19 -- # read -r var val 00:06:59.662 20:54:15 -- accel/accel.sh@20 -- # val='1 seconds' 00:06:59.662 20:54:15 -- accel/accel.sh@21 -- # case "$var" in 
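The run of val= assignments through this stretch is the harness parsing accel_perf's own configuration echo: each output line is split on ':' into var and val, and a case statement latches the fields the closing assertions need. A minimal sketch under that assumption (the match patterns are guesses, not the verbatim script):

    while IFS=: read -r var val; do
        case "$var" in
            *Workload*) accel_opc=${val//[[:space:]]/} ;;    # -> crc32c
            *Module*)   accel_module=${val//[[:space:]]/} ;; # -> software
        esac
    done < <(./build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y)

The later [[ -n software ]] and [[ -n crc32c ]] lines are these two variables, already expanded by xtrace.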
00:06:59.662 20:54:15 -- accel/accel.sh@19 -- # IFS=: 00:06:59.662 20:54:15 -- accel/accel.sh@19 -- # read -r var val 00:06:59.662 20:54:15 -- accel/accel.sh@20 -- # val=Yes 00:06:59.662 20:54:15 -- accel/accel.sh@21 -- # case "$var" in 00:06:59.662 20:54:15 -- accel/accel.sh@19 -- # IFS=: 00:06:59.662 20:54:15 -- accel/accel.sh@19 -- # read -r var val 00:06:59.662 20:54:15 -- accel/accel.sh@20 -- # val= 00:06:59.662 20:54:15 -- accel/accel.sh@21 -- # case "$var" in 00:06:59.662 20:54:15 -- accel/accel.sh@19 -- # IFS=: 00:06:59.662 20:54:15 -- accel/accel.sh@19 -- # read -r var val 00:06:59.662 20:54:15 -- accel/accel.sh@20 -- # val= 00:06:59.662 20:54:15 -- accel/accel.sh@21 -- # case "$var" in 00:06:59.662 20:54:15 -- accel/accel.sh@19 -- # IFS=: 00:06:59.662 20:54:15 -- accel/accel.sh@19 -- # read -r var val 00:07:01.036 20:54:16 -- accel/accel.sh@20 -- # val= 00:07:01.036 20:54:16 -- accel/accel.sh@21 -- # case "$var" in 00:07:01.036 20:54:16 -- accel/accel.sh@19 -- # IFS=: 00:07:01.036 20:54:16 -- accel/accel.sh@19 -- # read -r var val 00:07:01.036 20:54:16 -- accel/accel.sh@20 -- # val= 00:07:01.036 20:54:16 -- accel/accel.sh@21 -- # case "$var" in 00:07:01.036 20:54:16 -- accel/accel.sh@19 -- # IFS=: 00:07:01.036 20:54:16 -- accel/accel.sh@19 -- # read -r var val 00:07:01.036 20:54:16 -- accel/accel.sh@20 -- # val= 00:07:01.036 20:54:16 -- accel/accel.sh@21 -- # case "$var" in 00:07:01.036 20:54:16 -- accel/accel.sh@19 -- # IFS=: 00:07:01.036 20:54:16 -- accel/accel.sh@19 -- # read -r var val 00:07:01.036 20:54:16 -- accel/accel.sh@20 -- # val= 00:07:01.036 20:54:16 -- accel/accel.sh@21 -- # case "$var" in 00:07:01.036 20:54:16 -- accel/accel.sh@19 -- # IFS=: 00:07:01.036 20:54:16 -- accel/accel.sh@19 -- # read -r var val 00:07:01.036 20:54:16 -- accel/accel.sh@20 -- # val= 00:07:01.036 20:54:16 -- accel/accel.sh@21 -- # case "$var" in 00:07:01.036 20:54:16 -- accel/accel.sh@19 -- # IFS=: 00:07:01.036 20:54:16 -- accel/accel.sh@19 -- # read -r var val 00:07:01.036 20:54:16 -- accel/accel.sh@20 -- # val= 00:07:01.036 20:54:16 -- accel/accel.sh@21 -- # case "$var" in 00:07:01.036 20:54:16 -- accel/accel.sh@19 -- # IFS=: 00:07:01.036 20:54:16 -- accel/accel.sh@19 -- # read -r var val 00:07:01.036 20:54:16 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:01.036 20:54:16 -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:07:01.036 20:54:16 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:01.036 00:07:01.036 real 0m1.289s 00:07:01.036 user 0m1.165s 00:07:01.036 sys 0m0.127s 00:07:01.036 20:54:16 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:01.036 20:54:16 -- common/autotest_common.sh@10 -- # set +x 00:07:01.036 ************************************ 00:07:01.036 END TEST accel_crc32c 00:07:01.036 ************************************ 00:07:01.036 20:54:16 -- accel/accel.sh@102 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:07:01.036 20:54:16 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:07:01.036 20:54:16 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:01.036 20:54:16 -- common/autotest_common.sh@10 -- # set +x 00:07:01.036 ************************************ 00:07:01.036 START TEST accel_crc32c_C2 00:07:01.036 ************************************ 00:07:01.036 20:54:16 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w crc32c -y -C 2 00:07:01.036 20:54:16 -- accel/accel.sh@16 -- # local accel_opc 00:07:01.036 20:54:16 -- accel/accel.sh@17 -- # local accel_module 00:07:01.036 20:54:16 -- accel/accel.sh@19 -- # 
IFS=: 00:07:01.036 20:54:16 -- accel/accel.sh@19 -- # read -r var val 00:07:01.036 20:54:16 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:07:01.036 20:54:16 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:07:01.036 20:54:16 -- accel/accel.sh@12 -- # build_accel_config 00:07:01.036 20:54:16 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:01.036 20:54:16 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:01.036 20:54:16 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:01.036 20:54:16 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:01.036 20:54:16 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:01.036 20:54:16 -- accel/accel.sh@40 -- # local IFS=, 00:07:01.036 20:54:16 -- accel/accel.sh@41 -- # jq -r . 00:07:01.036 [2024-04-25 20:54:16.531383] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:07:01.036 [2024-04-25 20:54:16.531481] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid185802 ] 00:07:01.036 EAL: No free 2048 kB hugepages reported on node 1 00:07:01.036 [2024-04-25 20:54:16.569813] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:01.036 [2024-04-25 20:54:16.599312] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:01.036 [2024-04-25 20:54:16.634752] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:01.036 20:54:16 -- accel/accel.sh@20 -- # val= 00:07:01.036 20:54:16 -- accel/accel.sh@21 -- # case "$var" in 00:07:01.036 20:54:16 -- accel/accel.sh@19 -- # IFS=: 00:07:01.036 20:54:16 -- accel/accel.sh@19 -- # read -r var val 00:07:01.036 20:54:16 -- accel/accel.sh@20 -- # val= 00:07:01.036 20:54:16 -- accel/accel.sh@21 -- # case "$var" in 00:07:01.036 20:54:16 -- accel/accel.sh@19 -- # IFS=: 00:07:01.036 20:54:16 -- accel/accel.sh@19 -- # read -r var val 00:07:01.036 20:54:16 -- accel/accel.sh@20 -- # val=0x1 00:07:01.036 20:54:16 -- accel/accel.sh@21 -- # case "$var" in 00:07:01.036 20:54:16 -- accel/accel.sh@19 -- # IFS=: 00:07:01.036 20:54:16 -- accel/accel.sh@19 -- # read -r var val 00:07:01.036 20:54:16 -- accel/accel.sh@20 -- # val= 00:07:01.036 20:54:16 -- accel/accel.sh@21 -- # case "$var" in 00:07:01.036 20:54:16 -- accel/accel.sh@19 -- # IFS=: 00:07:01.036 20:54:16 -- accel/accel.sh@19 -- # read -r var val 00:07:01.036 20:54:16 -- accel/accel.sh@20 -- # val= 00:07:01.036 20:54:16 -- accel/accel.sh@21 -- # case "$var" in 00:07:01.036 20:54:16 -- accel/accel.sh@19 -- # IFS=: 00:07:01.036 20:54:16 -- accel/accel.sh@19 -- # read -r var val 00:07:01.036 20:54:16 -- accel/accel.sh@20 -- # val=crc32c 00:07:01.036 20:54:16 -- accel/accel.sh@21 -- # case "$var" in 00:07:01.036 20:54:16 -- accel/accel.sh@23 -- # accel_opc=crc32c 00:07:01.036 20:54:16 -- accel/accel.sh@19 -- # IFS=: 00:07:01.036 20:54:16 -- accel/accel.sh@19 -- # read -r var val 00:07:01.036 20:54:16 -- accel/accel.sh@20 -- # val=0 00:07:01.036 20:54:16 -- accel/accel.sh@21 -- # case "$var" in 00:07:01.036 20:54:16 -- accel/accel.sh@19 -- # IFS=: 00:07:01.036 20:54:16 -- accel/accel.sh@19 -- # read -r var val 00:07:01.036 20:54:16 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:01.036 20:54:16 -- accel/accel.sh@21 -- # case "$var" in 00:07:01.036 20:54:16 -- accel/accel.sh@19 
-- # IFS=: 00:07:01.036 20:54:16 -- accel/accel.sh@19 -- # read -r var val 00:07:01.036 20:54:16 -- accel/accel.sh@20 -- # val= 00:07:01.036 20:54:16 -- accel/accel.sh@21 -- # case "$var" in 00:07:01.036 20:54:16 -- accel/accel.sh@19 -- # IFS=: 00:07:01.036 20:54:16 -- accel/accel.sh@19 -- # read -r var val 00:07:01.036 20:54:16 -- accel/accel.sh@20 -- # val=software 00:07:01.036 20:54:16 -- accel/accel.sh@21 -- # case "$var" in 00:07:01.036 20:54:16 -- accel/accel.sh@22 -- # accel_module=software 00:07:01.036 20:54:16 -- accel/accel.sh@19 -- # IFS=: 00:07:01.036 20:54:16 -- accel/accel.sh@19 -- # read -r var val 00:07:01.036 20:54:16 -- accel/accel.sh@20 -- # val=32 00:07:01.036 20:54:16 -- accel/accel.sh@21 -- # case "$var" in 00:07:01.036 20:54:16 -- accel/accel.sh@19 -- # IFS=: 00:07:01.036 20:54:16 -- accel/accel.sh@19 -- # read -r var val 00:07:01.036 20:54:16 -- accel/accel.sh@20 -- # val=32 00:07:01.036 20:54:16 -- accel/accel.sh@21 -- # case "$var" in 00:07:01.036 20:54:16 -- accel/accel.sh@19 -- # IFS=: 00:07:01.036 20:54:16 -- accel/accel.sh@19 -- # read -r var val 00:07:01.036 20:54:16 -- accel/accel.sh@20 -- # val=1 00:07:01.036 20:54:16 -- accel/accel.sh@21 -- # case "$var" in 00:07:01.036 20:54:16 -- accel/accel.sh@19 -- # IFS=: 00:07:01.036 20:54:16 -- accel/accel.sh@19 -- # read -r var val 00:07:01.036 20:54:16 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:01.036 20:54:16 -- accel/accel.sh@21 -- # case "$var" in 00:07:01.036 20:54:16 -- accel/accel.sh@19 -- # IFS=: 00:07:01.036 20:54:16 -- accel/accel.sh@19 -- # read -r var val 00:07:01.036 20:54:16 -- accel/accel.sh@20 -- # val=Yes 00:07:01.036 20:54:16 -- accel/accel.sh@21 -- # case "$var" in 00:07:01.036 20:54:16 -- accel/accel.sh@19 -- # IFS=: 00:07:01.036 20:54:16 -- accel/accel.sh@19 -- # read -r var val 00:07:01.036 20:54:16 -- accel/accel.sh@20 -- # val= 00:07:01.036 20:54:16 -- accel/accel.sh@21 -- # case "$var" in 00:07:01.036 20:54:16 -- accel/accel.sh@19 -- # IFS=: 00:07:01.036 20:54:16 -- accel/accel.sh@19 -- # read -r var val 00:07:01.036 20:54:16 -- accel/accel.sh@20 -- # val= 00:07:01.036 20:54:16 -- accel/accel.sh@21 -- # case "$var" in 00:07:01.036 20:54:16 -- accel/accel.sh@19 -- # IFS=: 00:07:01.036 20:54:16 -- accel/accel.sh@19 -- # read -r var val 00:07:02.410 20:54:17 -- accel/accel.sh@20 -- # val= 00:07:02.410 20:54:17 -- accel/accel.sh@21 -- # case "$var" in 00:07:02.410 20:54:17 -- accel/accel.sh@19 -- # IFS=: 00:07:02.410 20:54:17 -- accel/accel.sh@19 -- # read -r var val 00:07:02.410 20:54:17 -- accel/accel.sh@20 -- # val= 00:07:02.410 20:54:17 -- accel/accel.sh@21 -- # case "$var" in 00:07:02.410 20:54:17 -- accel/accel.sh@19 -- # IFS=: 00:07:02.410 20:54:17 -- accel/accel.sh@19 -- # read -r var val 00:07:02.410 20:54:17 -- accel/accel.sh@20 -- # val= 00:07:02.410 20:54:17 -- accel/accel.sh@21 -- # case "$var" in 00:07:02.410 20:54:17 -- accel/accel.sh@19 -- # IFS=: 00:07:02.410 20:54:17 -- accel/accel.sh@19 -- # read -r var val 00:07:02.410 20:54:17 -- accel/accel.sh@20 -- # val= 00:07:02.410 20:54:17 -- accel/accel.sh@21 -- # case "$var" in 00:07:02.410 20:54:17 -- accel/accel.sh@19 -- # IFS=: 00:07:02.410 20:54:17 -- accel/accel.sh@19 -- # read -r var val 00:07:02.410 20:54:17 -- accel/accel.sh@20 -- # val= 00:07:02.410 20:54:17 -- accel/accel.sh@21 -- # case "$var" in 00:07:02.410 20:54:17 -- accel/accel.sh@19 -- # IFS=: 00:07:02.410 20:54:17 -- accel/accel.sh@19 -- # read -r var val 00:07:02.410 20:54:17 -- accel/accel.sh@20 -- # val= 00:07:02.410 20:54:17 -- accel/accel.sh@21 -- # case 
"$var" in 00:07:02.410 20:54:17 -- accel/accel.sh@19 -- # IFS=: 00:07:02.410 20:54:17 -- accel/accel.sh@19 -- # read -r var val 00:07:02.410 20:54:17 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:02.410 20:54:17 -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:07:02.410 20:54:17 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:02.410 00:07:02.410 real 0m1.284s 00:07:02.410 user 0m1.160s 00:07:02.410 sys 0m0.128s 00:07:02.410 20:54:17 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:02.410 20:54:17 -- common/autotest_common.sh@10 -- # set +x 00:07:02.410 ************************************ 00:07:02.410 END TEST accel_crc32c_C2 00:07:02.410 ************************************ 00:07:02.410 20:54:17 -- accel/accel.sh@103 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:07:02.410 20:54:17 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:07:02.410 20:54:17 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:02.410 20:54:17 -- common/autotest_common.sh@10 -- # set +x 00:07:02.410 ************************************ 00:07:02.410 START TEST accel_copy 00:07:02.410 ************************************ 00:07:02.410 20:54:17 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w copy -y 00:07:02.410 20:54:17 -- accel/accel.sh@16 -- # local accel_opc 00:07:02.410 20:54:17 -- accel/accel.sh@17 -- # local accel_module 00:07:02.410 20:54:17 -- accel/accel.sh@19 -- # IFS=: 00:07:02.410 20:54:17 -- accel/accel.sh@19 -- # read -r var val 00:07:02.410 20:54:17 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:07:02.410 20:54:17 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:07:02.410 20:54:17 -- accel/accel.sh@12 -- # build_accel_config 00:07:02.410 20:54:17 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:02.410 20:54:17 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:02.410 20:54:17 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:02.410 20:54:17 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:02.410 20:54:18 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:02.410 20:54:18 -- accel/accel.sh@40 -- # local IFS=, 00:07:02.410 20:54:18 -- accel/accel.sh@41 -- # jq -r . 00:07:02.410 [2024-04-25 20:54:18.015624] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:07:02.410 [2024-04-25 20:54:18.015709] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid186088 ] 00:07:02.410 EAL: No free 2048 kB hugepages reported on node 1 00:07:02.410 [2024-04-25 20:54:18.054386] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 
00:07:02.668 [2024-04-25 20:54:18.085785] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:02.668 [2024-04-25 20:54:18.121850] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:02.668 20:54:18 -- accel/accel.sh@20 -- # val= 00:07:02.668 20:54:18 -- accel/accel.sh@21 -- # case "$var" in 00:07:02.668 20:54:18 -- accel/accel.sh@19 -- # IFS=: 00:07:02.668 20:54:18 -- accel/accel.sh@19 -- # read -r var val 00:07:02.668 20:54:18 -- accel/accel.sh@20 -- # val= 00:07:02.668 20:54:18 -- accel/accel.sh@21 -- # case "$var" in 00:07:02.668 20:54:18 -- accel/accel.sh@19 -- # IFS=: 00:07:02.668 20:54:18 -- accel/accel.sh@19 -- # read -r var val 00:07:02.668 20:54:18 -- accel/accel.sh@20 -- # val=0x1 00:07:02.668 20:54:18 -- accel/accel.sh@21 -- # case "$var" in 00:07:02.668 20:54:18 -- accel/accel.sh@19 -- # IFS=: 00:07:02.668 20:54:18 -- accel/accel.sh@19 -- # read -r var val 00:07:02.668 20:54:18 -- accel/accel.sh@20 -- # val= 00:07:02.668 20:54:18 -- accel/accel.sh@21 -- # case "$var" in 00:07:02.668 20:54:18 -- accel/accel.sh@19 -- # IFS=: 00:07:02.668 20:54:18 -- accel/accel.sh@19 -- # read -r var val 00:07:02.668 20:54:18 -- accel/accel.sh@20 -- # val= 00:07:02.668 20:54:18 -- accel/accel.sh@21 -- # case "$var" in 00:07:02.668 20:54:18 -- accel/accel.sh@19 -- # IFS=: 00:07:02.668 20:54:18 -- accel/accel.sh@19 -- # read -r var val 00:07:02.668 20:54:18 -- accel/accel.sh@20 -- # val=copy 00:07:02.668 20:54:18 -- accel/accel.sh@21 -- # case "$var" in 00:07:02.668 20:54:18 -- accel/accel.sh@23 -- # accel_opc=copy 00:07:02.668 20:54:18 -- accel/accel.sh@19 -- # IFS=: 00:07:02.668 20:54:18 -- accel/accel.sh@19 -- # read -r var val 00:07:02.668 20:54:18 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:02.668 20:54:18 -- accel/accel.sh@21 -- # case "$var" in 00:07:02.668 20:54:18 -- accel/accel.sh@19 -- # IFS=: 00:07:02.668 20:54:18 -- accel/accel.sh@19 -- # read -r var val 00:07:02.668 20:54:18 -- accel/accel.sh@20 -- # val= 00:07:02.668 20:54:18 -- accel/accel.sh@21 -- # case "$var" in 00:07:02.668 20:54:18 -- accel/accel.sh@19 -- # IFS=: 00:07:02.668 20:54:18 -- accel/accel.sh@19 -- # read -r var val 00:07:02.668 20:54:18 -- accel/accel.sh@20 -- # val=software 00:07:02.668 20:54:18 -- accel/accel.sh@21 -- # case "$var" in 00:07:02.668 20:54:18 -- accel/accel.sh@22 -- # accel_module=software 00:07:02.668 20:54:18 -- accel/accel.sh@19 -- # IFS=: 00:07:02.668 20:54:18 -- accel/accel.sh@19 -- # read -r var val 00:07:02.668 20:54:18 -- accel/accel.sh@20 -- # val=32 00:07:02.668 20:54:18 -- accel/accel.sh@21 -- # case "$var" in 00:07:02.668 20:54:18 -- accel/accel.sh@19 -- # IFS=: 00:07:02.668 20:54:18 -- accel/accel.sh@19 -- # read -r var val 00:07:02.668 20:54:18 -- accel/accel.sh@20 -- # val=32 00:07:02.668 20:54:18 -- accel/accel.sh@21 -- # case "$var" in 00:07:02.668 20:54:18 -- accel/accel.sh@19 -- # IFS=: 00:07:02.668 20:54:18 -- accel/accel.sh@19 -- # read -r var val 00:07:02.668 20:54:18 -- accel/accel.sh@20 -- # val=1 00:07:02.668 20:54:18 -- accel/accel.sh@21 -- # case "$var" in 00:07:02.668 20:54:18 -- accel/accel.sh@19 -- # IFS=: 00:07:02.668 20:54:18 -- accel/accel.sh@19 -- # read -r var val 00:07:02.668 20:54:18 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:02.669 20:54:18 -- accel/accel.sh@21 -- # case "$var" in 00:07:02.669 20:54:18 -- accel/accel.sh@19 -- # IFS=: 00:07:02.669 20:54:18 -- accel/accel.sh@19 -- # read -r var val 00:07:02.669 20:54:18 -- accel/accel.sh@20 -- # val=Yes 00:07:02.669 20:54:18 -- accel/accel.sh@21 -- # case "$var" in 
00:07:02.669 20:54:18 -- accel/accel.sh@19 -- # IFS=: 00:07:02.669 20:54:18 -- accel/accel.sh@19 -- # read -r var val 00:07:02.669 20:54:18 -- accel/accel.sh@20 -- # val= 00:07:02.669 20:54:18 -- accel/accel.sh@21 -- # case "$var" in 00:07:02.669 20:54:18 -- accel/accel.sh@19 -- # IFS=: 00:07:02.669 20:54:18 -- accel/accel.sh@19 -- # read -r var val 00:07:02.669 20:54:18 -- accel/accel.sh@20 -- # val= 00:07:02.669 20:54:18 -- accel/accel.sh@21 -- # case "$var" in 00:07:02.669 20:54:18 -- accel/accel.sh@19 -- # IFS=: 00:07:02.669 20:54:18 -- accel/accel.sh@19 -- # read -r var val 00:07:04.042 20:54:19 -- accel/accel.sh@20 -- # val= 00:07:04.042 20:54:19 -- accel/accel.sh@21 -- # case "$var" in 00:07:04.042 20:54:19 -- accel/accel.sh@19 -- # IFS=: 00:07:04.042 20:54:19 -- accel/accel.sh@19 -- # read -r var val 00:07:04.042 20:54:19 -- accel/accel.sh@20 -- # val= 00:07:04.042 20:54:19 -- accel/accel.sh@21 -- # case "$var" in 00:07:04.042 20:54:19 -- accel/accel.sh@19 -- # IFS=: 00:07:04.042 20:54:19 -- accel/accel.sh@19 -- # read -r var val 00:07:04.042 20:54:19 -- accel/accel.sh@20 -- # val= 00:07:04.042 20:54:19 -- accel/accel.sh@21 -- # case "$var" in 00:07:04.042 20:54:19 -- accel/accel.sh@19 -- # IFS=: 00:07:04.042 20:54:19 -- accel/accel.sh@19 -- # read -r var val 00:07:04.042 20:54:19 -- accel/accel.sh@20 -- # val= 00:07:04.042 20:54:19 -- accel/accel.sh@21 -- # case "$var" in 00:07:04.042 20:54:19 -- accel/accel.sh@19 -- # IFS=: 00:07:04.042 20:54:19 -- accel/accel.sh@19 -- # read -r var val 00:07:04.042 20:54:19 -- accel/accel.sh@20 -- # val= 00:07:04.042 20:54:19 -- accel/accel.sh@21 -- # case "$var" in 00:07:04.042 20:54:19 -- accel/accel.sh@19 -- # IFS=: 00:07:04.042 20:54:19 -- accel/accel.sh@19 -- # read -r var val 00:07:04.042 20:54:19 -- accel/accel.sh@20 -- # val= 00:07:04.042 20:54:19 -- accel/accel.sh@21 -- # case "$var" in 00:07:04.042 20:54:19 -- accel/accel.sh@19 -- # IFS=: 00:07:04.042 20:54:19 -- accel/accel.sh@19 -- # read -r var val 00:07:04.042 20:54:19 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:04.042 20:54:19 -- accel/accel.sh@27 -- # [[ -n copy ]] 00:07:04.042 20:54:19 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:04.042 00:07:04.042 real 0m1.288s 00:07:04.042 user 0m1.160s 00:07:04.042 sys 0m0.131s 00:07:04.042 20:54:19 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:04.042 20:54:19 -- common/autotest_common.sh@10 -- # set +x 00:07:04.042 ************************************ 00:07:04.042 END TEST accel_copy 00:07:04.042 ************************************ 00:07:04.042 20:54:19 -- accel/accel.sh@104 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:04.042 20:54:19 -- common/autotest_common.sh@1087 -- # '[' 13 -le 1 ']' 00:07:04.042 20:54:19 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:04.042 20:54:19 -- common/autotest_common.sh@10 -- # set +x 00:07:04.042 ************************************ 00:07:04.042 START TEST accel_fill 00:07:04.042 ************************************ 00:07:04.042 20:54:19 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:04.042 20:54:19 -- accel/accel.sh@16 -- # local accel_opc 00:07:04.042 20:54:19 -- accel/accel.sh@17 -- # local accel_module 00:07:04.042 20:54:19 -- accel/accel.sh@19 -- # IFS=: 00:07:04.042 20:54:19 -- accel/accel.sh@19 -- # read -r var val 00:07:04.042 20:54:19 -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:04.043 20:54:19 -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:04.043 20:54:19 -- accel/accel.sh@12 -- # build_accel_config 00:07:04.043 20:54:19 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:04.043 20:54:19 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:04.043 20:54:19 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:04.043 20:54:19 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:04.043 20:54:19 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:04.043 20:54:19 -- accel/accel.sh@40 -- # local IFS=, 00:07:04.043 20:54:19 -- accel/accel.sh@41 -- # jq -r . 00:07:04.043 [2024-04-25 20:54:19.508022] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:07:04.043 [2024-04-25 20:54:19.508105] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid186386 ] 00:07:04.043 EAL: No free 2048 kB hugepages reported on node 1 00:07:04.043 [2024-04-25 20:54:19.548700] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:04.043 [2024-04-25 20:54:19.579996] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:04.043 [2024-04-25 20:54:19.618285] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:04.043 20:54:19 -- accel/accel.sh@20 -- # val= 00:07:04.043 20:54:19 -- accel/accel.sh@21 -- # case "$var" in 00:07:04.043 20:54:19 -- accel/accel.sh@19 -- # IFS=: 00:07:04.043 20:54:19 -- accel/accel.sh@19 -- # read -r var val 00:07:04.043 20:54:19 -- accel/accel.sh@20 -- # val= 00:07:04.043 20:54:19 -- accel/accel.sh@21 -- # case "$var" in 00:07:04.043 20:54:19 -- accel/accel.sh@19 -- # IFS=: 00:07:04.043 20:54:19 -- accel/accel.sh@19 -- # read -r var val 00:07:04.043 20:54:19 -- accel/accel.sh@20 -- # val=0x1 00:07:04.043 20:54:19 -- accel/accel.sh@21 -- # case "$var" in 00:07:04.043 20:54:19 -- accel/accel.sh@19 -- # IFS=: 00:07:04.043 20:54:19 -- accel/accel.sh@19 -- # read -r var val 00:07:04.043 20:54:19 -- accel/accel.sh@20 -- # val= 00:07:04.043 20:54:19 -- accel/accel.sh@21 -- # case "$var" in 00:07:04.043 20:54:19 -- accel/accel.sh@19 -- # IFS=: 00:07:04.043 20:54:19 -- accel/accel.sh@19 -- # read -r var val 00:07:04.043 20:54:19 -- accel/accel.sh@20 -- # val= 00:07:04.043 20:54:19 -- accel/accel.sh@21 -- # case "$var" in 00:07:04.043 20:54:19 -- accel/accel.sh@19 -- # IFS=: 00:07:04.043 20:54:19 -- accel/accel.sh@19 -- # read -r var val 00:07:04.043 20:54:19 -- accel/accel.sh@20 -- # val=fill 00:07:04.043 20:54:19 -- accel/accel.sh@21 -- # case "$var" in 00:07:04.043 20:54:19 -- accel/accel.sh@23 -- # accel_opc=fill 00:07:04.043 20:54:19 -- accel/accel.sh@19 -- # IFS=: 00:07:04.043 20:54:19 -- accel/accel.sh@19 -- # read -r var val 00:07:04.043 20:54:19 -- accel/accel.sh@20 -- # val=0x80 00:07:04.043 20:54:19 -- accel/accel.sh@21 -- # case "$var" in 00:07:04.043 20:54:19 -- accel/accel.sh@19 -- # IFS=: 00:07:04.043 20:54:19 -- accel/accel.sh@19 -- # read -r var val 00:07:04.043 20:54:19 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:04.043 20:54:19 -- accel/accel.sh@21 -- # case "$var" in 00:07:04.043 20:54:19 -- accel/accel.sh@19 -- # IFS=: 00:07:04.043 20:54:19 -- accel/accel.sh@19 -- # read -r var val 00:07:04.043 20:54:19 -- accel/accel.sh@20 -- # val= 00:07:04.043 20:54:19 -- accel/accel.sh@21 -- # case "$var" 
in 00:07:04.043 20:54:19 -- accel/accel.sh@19 -- # IFS=: 00:07:04.043 20:54:19 -- accel/accel.sh@19 -- # read -r var val 00:07:04.043 20:54:19 -- accel/accel.sh@20 -- # val=software 00:07:04.043 20:54:19 -- accel/accel.sh@21 -- # case "$var" in 00:07:04.043 20:54:19 -- accel/accel.sh@22 -- # accel_module=software 00:07:04.043 20:54:19 -- accel/accel.sh@19 -- # IFS=: 00:07:04.043 20:54:19 -- accel/accel.sh@19 -- # read -r var val 00:07:04.043 20:54:19 -- accel/accel.sh@20 -- # val=64 00:07:04.043 20:54:19 -- accel/accel.sh@21 -- # case "$var" in 00:07:04.043 20:54:19 -- accel/accel.sh@19 -- # IFS=: 00:07:04.043 20:54:19 -- accel/accel.sh@19 -- # read -r var val 00:07:04.043 20:54:19 -- accel/accel.sh@20 -- # val=64 00:07:04.043 20:54:19 -- accel/accel.sh@21 -- # case "$var" in 00:07:04.043 20:54:19 -- accel/accel.sh@19 -- # IFS=: 00:07:04.043 20:54:19 -- accel/accel.sh@19 -- # read -r var val 00:07:04.043 20:54:19 -- accel/accel.sh@20 -- # val=1 00:07:04.043 20:54:19 -- accel/accel.sh@21 -- # case "$var" in 00:07:04.043 20:54:19 -- accel/accel.sh@19 -- # IFS=: 00:07:04.043 20:54:19 -- accel/accel.sh@19 -- # read -r var val 00:07:04.043 20:54:19 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:04.043 20:54:19 -- accel/accel.sh@21 -- # case "$var" in 00:07:04.043 20:54:19 -- accel/accel.sh@19 -- # IFS=: 00:07:04.043 20:54:19 -- accel/accel.sh@19 -- # read -r var val 00:07:04.043 20:54:19 -- accel/accel.sh@20 -- # val=Yes 00:07:04.043 20:54:19 -- accel/accel.sh@21 -- # case "$var" in 00:07:04.043 20:54:19 -- accel/accel.sh@19 -- # IFS=: 00:07:04.043 20:54:19 -- accel/accel.sh@19 -- # read -r var val 00:07:04.043 20:54:19 -- accel/accel.sh@20 -- # val= 00:07:04.043 20:54:19 -- accel/accel.sh@21 -- # case "$var" in 00:07:04.043 20:54:19 -- accel/accel.sh@19 -- # IFS=: 00:07:04.043 20:54:19 -- accel/accel.sh@19 -- # read -r var val 00:07:04.043 20:54:19 -- accel/accel.sh@20 -- # val= 00:07:04.043 20:54:19 -- accel/accel.sh@21 -- # case "$var" in 00:07:04.043 20:54:19 -- accel/accel.sh@19 -- # IFS=: 00:07:04.043 20:54:19 -- accel/accel.sh@19 -- # read -r var val 00:07:05.417 20:54:20 -- accel/accel.sh@20 -- # val= 00:07:05.417 20:54:20 -- accel/accel.sh@21 -- # case "$var" in 00:07:05.417 20:54:20 -- accel/accel.sh@19 -- # IFS=: 00:07:05.417 20:54:20 -- accel/accel.sh@19 -- # read -r var val 00:07:05.417 20:54:20 -- accel/accel.sh@20 -- # val= 00:07:05.417 20:54:20 -- accel/accel.sh@21 -- # case "$var" in 00:07:05.417 20:54:20 -- accel/accel.sh@19 -- # IFS=: 00:07:05.417 20:54:20 -- accel/accel.sh@19 -- # read -r var val 00:07:05.417 20:54:20 -- accel/accel.sh@20 -- # val= 00:07:05.417 20:54:20 -- accel/accel.sh@21 -- # case "$var" in 00:07:05.417 20:54:20 -- accel/accel.sh@19 -- # IFS=: 00:07:05.417 20:54:20 -- accel/accel.sh@19 -- # read -r var val 00:07:05.417 20:54:20 -- accel/accel.sh@20 -- # val= 00:07:05.417 20:54:20 -- accel/accel.sh@21 -- # case "$var" in 00:07:05.417 20:54:20 -- accel/accel.sh@19 -- # IFS=: 00:07:05.417 20:54:20 -- accel/accel.sh@19 -- # read -r var val 00:07:05.417 20:54:20 -- accel/accel.sh@20 -- # val= 00:07:05.417 20:54:20 -- accel/accel.sh@21 -- # case "$var" in 00:07:05.417 20:54:20 -- accel/accel.sh@19 -- # IFS=: 00:07:05.417 20:54:20 -- accel/accel.sh@19 -- # read -r var val 00:07:05.417 20:54:20 -- accel/accel.sh@20 -- # val= 00:07:05.417 20:54:20 -- accel/accel.sh@21 -- # case "$var" in 00:07:05.417 20:54:20 -- accel/accel.sh@19 -- # IFS=: 00:07:05.417 20:54:20 -- accel/accel.sh@19 -- # read -r var val 00:07:05.417 20:54:20 -- accel/accel.sh@27 -- # [[ -n 
software ]] 00:07:05.417 20:54:20 -- accel/accel.sh@27 -- # [[ -n fill ]] 00:07:05.417 20:54:20 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:05.417 00:07:05.417 real 0m1.296s 00:07:05.417 user 0m1.166s 00:07:05.417 sys 0m0.134s 00:07:05.417 20:54:20 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:05.417 20:54:20 -- common/autotest_common.sh@10 -- # set +x 00:07:05.417 ************************************ 00:07:05.417 END TEST accel_fill 00:07:05.417 ************************************ 00:07:05.417 20:54:20 -- accel/accel.sh@105 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:07:05.417 20:54:20 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:07:05.417 20:54:20 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:05.417 20:54:20 -- common/autotest_common.sh@10 -- # set +x 00:07:05.417 ************************************ 00:07:05.417 START TEST accel_copy_crc32c 00:07:05.417 ************************************ 00:07:05.417 20:54:20 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w copy_crc32c -y 00:07:05.417 20:54:20 -- accel/accel.sh@16 -- # local accel_opc 00:07:05.417 20:54:20 -- accel/accel.sh@17 -- # local accel_module 00:07:05.417 20:54:20 -- accel/accel.sh@19 -- # IFS=: 00:07:05.417 20:54:20 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:07:05.417 20:54:20 -- accel/accel.sh@19 -- # read -r var val 00:07:05.417 20:54:20 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:07:05.417 20:54:20 -- accel/accel.sh@12 -- # build_accel_config 00:07:05.417 20:54:20 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:05.417 20:54:20 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:05.417 20:54:20 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:05.417 20:54:20 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:05.417 20:54:20 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:05.417 20:54:20 -- accel/accel.sh@40 -- # local IFS=, 00:07:05.417 20:54:20 -- accel/accel.sh@41 -- # jq -r . 00:07:05.417 [2024-04-25 20:54:20.998928] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:07:05.417 [2024-04-25 20:54:20.999014] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid186671 ] 00:07:05.417 EAL: No free 2048 kB hugepages reported on node 1 00:07:05.417 [2024-04-25 20:54:21.037418] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 
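copy_crc32c both copies a buffer and checksums it, which is consistent with the configuration echo below reporting two 4096-byte values where plain crc32c reported one; reading them as separate source and destination buffers is an inference from the trace, not something the log states. The invocation follows the usual pattern:

    ./build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y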
00:07:05.417 [2024-04-25 20:54:21.070452] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:05.676 [2024-04-25 20:54:21.110467] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:05.676 20:54:21 -- accel/accel.sh@20 -- # val= 00:07:05.676 20:54:21 -- accel/accel.sh@21 -- # case "$var" in 00:07:05.676 20:54:21 -- accel/accel.sh@19 -- # IFS=: 00:07:05.676 20:54:21 -- accel/accel.sh@19 -- # read -r var val 00:07:05.676 20:54:21 -- accel/accel.sh@20 -- # val= 00:07:05.676 20:54:21 -- accel/accel.sh@21 -- # case "$var" in 00:07:05.676 20:54:21 -- accel/accel.sh@19 -- # IFS=: 00:07:05.676 20:54:21 -- accel/accel.sh@19 -- # read -r var val 00:07:05.676 20:54:21 -- accel/accel.sh@20 -- # val=0x1 00:07:05.676 20:54:21 -- accel/accel.sh@21 -- # case "$var" in 00:07:05.676 20:54:21 -- accel/accel.sh@19 -- # IFS=: 00:07:05.676 20:54:21 -- accel/accel.sh@19 -- # read -r var val 00:07:05.676 20:54:21 -- accel/accel.sh@20 -- # val= 00:07:05.676 20:54:21 -- accel/accel.sh@21 -- # case "$var" in 00:07:05.676 20:54:21 -- accel/accel.sh@19 -- # IFS=: 00:07:05.676 20:54:21 -- accel/accel.sh@19 -- # read -r var val 00:07:05.676 20:54:21 -- accel/accel.sh@20 -- # val= 00:07:05.676 20:54:21 -- accel/accel.sh@21 -- # case "$var" in 00:07:05.676 20:54:21 -- accel/accel.sh@19 -- # IFS=: 00:07:05.676 20:54:21 -- accel/accel.sh@19 -- # read -r var val 00:07:05.676 20:54:21 -- accel/accel.sh@20 -- # val=copy_crc32c 00:07:05.676 20:54:21 -- accel/accel.sh@21 -- # case "$var" in 00:07:05.676 20:54:21 -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:07:05.676 20:54:21 -- accel/accel.sh@19 -- # IFS=: 00:07:05.676 20:54:21 -- accel/accel.sh@19 -- # read -r var val 00:07:05.676 20:54:21 -- accel/accel.sh@20 -- # val=0 00:07:05.676 20:54:21 -- accel/accel.sh@21 -- # case "$var" in 00:07:05.676 20:54:21 -- accel/accel.sh@19 -- # IFS=: 00:07:05.676 20:54:21 -- accel/accel.sh@19 -- # read -r var val 00:07:05.676 20:54:21 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:05.676 20:54:21 -- accel/accel.sh@21 -- # case "$var" in 00:07:05.676 20:54:21 -- accel/accel.sh@19 -- # IFS=: 00:07:05.676 20:54:21 -- accel/accel.sh@19 -- # read -r var val 00:07:05.676 20:54:21 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:05.676 20:54:21 -- accel/accel.sh@21 -- # case "$var" in 00:07:05.676 20:54:21 -- accel/accel.sh@19 -- # IFS=: 00:07:05.676 20:54:21 -- accel/accel.sh@19 -- # read -r var val 00:07:05.676 20:54:21 -- accel/accel.sh@20 -- # val= 00:07:05.676 20:54:21 -- accel/accel.sh@21 -- # case "$var" in 00:07:05.676 20:54:21 -- accel/accel.sh@19 -- # IFS=: 00:07:05.676 20:54:21 -- accel/accel.sh@19 -- # read -r var val 00:07:05.676 20:54:21 -- accel/accel.sh@20 -- # val=software 00:07:05.676 20:54:21 -- accel/accel.sh@21 -- # case "$var" in 00:07:05.676 20:54:21 -- accel/accel.sh@22 -- # accel_module=software 00:07:05.676 20:54:21 -- accel/accel.sh@19 -- # IFS=: 00:07:05.676 20:54:21 -- accel/accel.sh@19 -- # read -r var val 00:07:05.676 20:54:21 -- accel/accel.sh@20 -- # val=32 00:07:05.676 20:54:21 -- accel/accel.sh@21 -- # case "$var" in 00:07:05.676 20:54:21 -- accel/accel.sh@19 -- # IFS=: 00:07:05.676 20:54:21 -- accel/accel.sh@19 -- # read -r var val 00:07:05.676 20:54:21 -- accel/accel.sh@20 -- # val=32 00:07:05.676 20:54:21 -- accel/accel.sh@21 -- # case "$var" in 00:07:05.676 20:54:21 -- accel/accel.sh@19 -- # IFS=: 00:07:05.676 20:54:21 -- accel/accel.sh@19 -- # read -r var val 00:07:05.676 20:54:21 -- accel/accel.sh@20 -- # val=1 00:07:05.676 20:54:21 -- accel/accel.sh@21 -- # case 
"$var" in 00:07:05.676 20:54:21 -- accel/accel.sh@19 -- # IFS=: 00:07:05.676 20:54:21 -- accel/accel.sh@19 -- # read -r var val 00:07:05.676 20:54:21 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:05.676 20:54:21 -- accel/accel.sh@21 -- # case "$var" in 00:07:05.676 20:54:21 -- accel/accel.sh@19 -- # IFS=: 00:07:05.676 20:54:21 -- accel/accel.sh@19 -- # read -r var val 00:07:05.676 20:54:21 -- accel/accel.sh@20 -- # val=Yes 00:07:05.676 20:54:21 -- accel/accel.sh@21 -- # case "$var" in 00:07:05.676 20:54:21 -- accel/accel.sh@19 -- # IFS=: 00:07:05.676 20:54:21 -- accel/accel.sh@19 -- # read -r var val 00:07:05.676 20:54:21 -- accel/accel.sh@20 -- # val= 00:07:05.676 20:54:21 -- accel/accel.sh@21 -- # case "$var" in 00:07:05.676 20:54:21 -- accel/accel.sh@19 -- # IFS=: 00:07:05.676 20:54:21 -- accel/accel.sh@19 -- # read -r var val 00:07:05.676 20:54:21 -- accel/accel.sh@20 -- # val= 00:07:05.676 20:54:21 -- accel/accel.sh@21 -- # case "$var" in 00:07:05.676 20:54:21 -- accel/accel.sh@19 -- # IFS=: 00:07:05.676 20:54:21 -- accel/accel.sh@19 -- # read -r var val 00:07:06.612 20:54:22 -- accel/accel.sh@20 -- # val= 00:07:06.612 20:54:22 -- accel/accel.sh@21 -- # case "$var" in 00:07:06.612 20:54:22 -- accel/accel.sh@19 -- # IFS=: 00:07:06.612 20:54:22 -- accel/accel.sh@19 -- # read -r var val 00:07:06.612 20:54:22 -- accel/accel.sh@20 -- # val= 00:07:06.612 20:54:22 -- accel/accel.sh@21 -- # case "$var" in 00:07:06.612 20:54:22 -- accel/accel.sh@19 -- # IFS=: 00:07:06.612 20:54:22 -- accel/accel.sh@19 -- # read -r var val 00:07:06.612 20:54:22 -- accel/accel.sh@20 -- # val= 00:07:06.612 20:54:22 -- accel/accel.sh@21 -- # case "$var" in 00:07:06.612 20:54:22 -- accel/accel.sh@19 -- # IFS=: 00:07:06.612 20:54:22 -- accel/accel.sh@19 -- # read -r var val 00:07:06.612 20:54:22 -- accel/accel.sh@20 -- # val= 00:07:06.612 20:54:22 -- accel/accel.sh@21 -- # case "$var" in 00:07:06.612 20:54:22 -- accel/accel.sh@19 -- # IFS=: 00:07:06.612 20:54:22 -- accel/accel.sh@19 -- # read -r var val 00:07:06.612 20:54:22 -- accel/accel.sh@20 -- # val= 00:07:06.870 20:54:22 -- accel/accel.sh@21 -- # case "$var" in 00:07:06.870 20:54:22 -- accel/accel.sh@19 -- # IFS=: 00:07:06.870 20:54:22 -- accel/accel.sh@19 -- # read -r var val 00:07:06.870 20:54:22 -- accel/accel.sh@20 -- # val= 00:07:06.870 20:54:22 -- accel/accel.sh@21 -- # case "$var" in 00:07:06.870 20:54:22 -- accel/accel.sh@19 -- # IFS=: 00:07:06.870 20:54:22 -- accel/accel.sh@19 -- # read -r var val 00:07:06.870 20:54:22 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:06.870 20:54:22 -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:07:06.870 20:54:22 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:06.870 00:07:06.870 real 0m1.295s 00:07:06.870 user 0m1.157s 00:07:06.870 sys 0m0.142s 00:07:06.870 20:54:22 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:06.870 20:54:22 -- common/autotest_common.sh@10 -- # set +x 00:07:06.870 ************************************ 00:07:06.870 END TEST accel_copy_crc32c 00:07:06.870 ************************************ 00:07:06.870 20:54:22 -- accel/accel.sh@106 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:07:06.870 20:54:22 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:07:06.870 20:54:22 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:06.870 20:54:22 -- common/autotest_common.sh@10 -- # set +x 00:07:06.870 ************************************ 00:07:06.870 START TEST accel_copy_crc32c_C2 00:07:06.870 
************************************ 00:07:06.870 20:54:22 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:07:06.870 20:54:22 -- accel/accel.sh@16 -- # local accel_opc 00:07:06.870 20:54:22 -- accel/accel.sh@17 -- # local accel_module 00:07:06.870 20:54:22 -- accel/accel.sh@19 -- # IFS=: 00:07:06.870 20:54:22 -- accel/accel.sh@19 -- # read -r var val 00:07:06.870 20:54:22 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:07:06.870 20:54:22 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:07:06.870 20:54:22 -- accel/accel.sh@12 -- # build_accel_config 00:07:06.870 20:54:22 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:06.870 20:54:22 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:06.870 20:54:22 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:06.870 20:54:22 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:06.870 20:54:22 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:06.870 20:54:22 -- accel/accel.sh@40 -- # local IFS=, 00:07:06.870 20:54:22 -- accel/accel.sh@41 -- # jq -r . 00:07:06.870 [2024-04-25 20:54:22.497146] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:07:06.871 [2024-04-25 20:54:22.497232] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid186965 ] 00:07:07.129 EAL: No free 2048 kB hugepages reported on node 1 00:07:07.129 [2024-04-25 20:54:22.536617] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 
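The _C2 variants append -C 2, which the help text above defines as the io vector size; the trace that follows accordingly shows the 4096-byte element size alongside an 8192-byte value, consistent with a two-element vector (2 x 4096). A sketch of the invocation:

    # same workload, driven through a 2-element io vector
    ./build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2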
00:07:07.129 [2024-04-25 20:54:22.567624] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:07.129 [2024-04-25 20:54:22.606781] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:07.129 20:54:22 -- accel/accel.sh@20 -- # val= 00:07:07.129 20:54:22 -- accel/accel.sh@21 -- # case "$var" in 00:07:07.129 20:54:22 -- accel/accel.sh@19 -- # IFS=: 00:07:07.129 20:54:22 -- accel/accel.sh@19 -- # read -r var val 00:07:07.129 20:54:22 -- accel/accel.sh@20 -- # val= 00:07:07.129 20:54:22 -- accel/accel.sh@21 -- # case "$var" in 00:07:07.129 20:54:22 -- accel/accel.sh@19 -- # IFS=: 00:07:07.129 20:54:22 -- accel/accel.sh@19 -- # read -r var val 00:07:07.129 20:54:22 -- accel/accel.sh@20 -- # val=0x1 00:07:07.129 20:54:22 -- accel/accel.sh@21 -- # case "$var" in 00:07:07.129 20:54:22 -- accel/accel.sh@19 -- # IFS=: 00:07:07.129 20:54:22 -- accel/accel.sh@19 -- # read -r var val 00:07:07.129 20:54:22 -- accel/accel.sh@20 -- # val= 00:07:07.129 20:54:22 -- accel/accel.sh@21 -- # case "$var" in 00:07:07.129 20:54:22 -- accel/accel.sh@19 -- # IFS=: 00:07:07.129 20:54:22 -- accel/accel.sh@19 -- # read -r var val 00:07:07.129 20:54:22 -- accel/accel.sh@20 -- # val= 00:07:07.129 20:54:22 -- accel/accel.sh@21 -- # case "$var" in 00:07:07.129 20:54:22 -- accel/accel.sh@19 -- # IFS=: 00:07:07.129 20:54:22 -- accel/accel.sh@19 -- # read -r var val 00:07:07.129 20:54:22 -- accel/accel.sh@20 -- # val=copy_crc32c 00:07:07.129 20:54:22 -- accel/accel.sh@21 -- # case "$var" in 00:07:07.129 20:54:22 -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:07:07.129 20:54:22 -- accel/accel.sh@19 -- # IFS=: 00:07:07.129 20:54:22 -- accel/accel.sh@19 -- # read -r var val 00:07:07.129 20:54:22 -- accel/accel.sh@20 -- # val=0 00:07:07.129 20:54:22 -- accel/accel.sh@21 -- # case "$var" in 00:07:07.129 20:54:22 -- accel/accel.sh@19 -- # IFS=: 00:07:07.129 20:54:22 -- accel/accel.sh@19 -- # read -r var val 00:07:07.129 20:54:22 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:07.129 20:54:22 -- accel/accel.sh@21 -- # case "$var" in 00:07:07.129 20:54:22 -- accel/accel.sh@19 -- # IFS=: 00:07:07.129 20:54:22 -- accel/accel.sh@19 -- # read -r var val 00:07:07.129 20:54:22 -- accel/accel.sh@20 -- # val='8192 bytes' 00:07:07.129 20:54:22 -- accel/accel.sh@21 -- # case "$var" in 00:07:07.129 20:54:22 -- accel/accel.sh@19 -- # IFS=: 00:07:07.129 20:54:22 -- accel/accel.sh@19 -- # read -r var val 00:07:07.129 20:54:22 -- accel/accel.sh@20 -- # val= 00:07:07.129 20:54:22 -- accel/accel.sh@21 -- # case "$var" in 00:07:07.129 20:54:22 -- accel/accel.sh@19 -- # IFS=: 00:07:07.129 20:54:22 -- accel/accel.sh@19 -- # read -r var val 00:07:07.129 20:54:22 -- accel/accel.sh@20 -- # val=software 00:07:07.129 20:54:22 -- accel/accel.sh@21 -- # case "$var" in 00:07:07.129 20:54:22 -- accel/accel.sh@22 -- # accel_module=software 00:07:07.129 20:54:22 -- accel/accel.sh@19 -- # IFS=: 00:07:07.129 20:54:22 -- accel/accel.sh@19 -- # read -r var val 00:07:07.129 20:54:22 -- accel/accel.sh@20 -- # val=32 00:07:07.129 20:54:22 -- accel/accel.sh@21 -- # case "$var" in 00:07:07.129 20:54:22 -- accel/accel.sh@19 -- # IFS=: 00:07:07.129 20:54:22 -- accel/accel.sh@19 -- # read -r var val 00:07:07.129 20:54:22 -- accel/accel.sh@20 -- # val=32 00:07:07.129 20:54:22 -- accel/accel.sh@21 -- # case "$var" in 00:07:07.129 20:54:22 -- accel/accel.sh@19 -- # IFS=: 00:07:07.129 20:54:22 -- accel/accel.sh@19 -- # read -r var val 00:07:07.129 20:54:22 -- accel/accel.sh@20 -- # val=1 00:07:07.129 20:54:22 -- accel/accel.sh@21 -- # case 
"$var" in 00:07:07.129 20:54:22 -- accel/accel.sh@19 -- # IFS=: 00:07:07.129 20:54:22 -- accel/accel.sh@19 -- # read -r var val 00:07:07.129 20:54:22 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:07.129 20:54:22 -- accel/accel.sh@21 -- # case "$var" in 00:07:07.129 20:54:22 -- accel/accel.sh@19 -- # IFS=: 00:07:07.129 20:54:22 -- accel/accel.sh@19 -- # read -r var val 00:07:07.129 20:54:22 -- accel/accel.sh@20 -- # val=Yes 00:07:07.129 20:54:22 -- accel/accel.sh@21 -- # case "$var" in 00:07:07.129 20:54:22 -- accel/accel.sh@19 -- # IFS=: 00:07:07.129 20:54:22 -- accel/accel.sh@19 -- # read -r var val 00:07:07.129 20:54:22 -- accel/accel.sh@20 -- # val= 00:07:07.130 20:54:22 -- accel/accel.sh@21 -- # case "$var" in 00:07:07.130 20:54:22 -- accel/accel.sh@19 -- # IFS=: 00:07:07.130 20:54:22 -- accel/accel.sh@19 -- # read -r var val 00:07:07.130 20:54:22 -- accel/accel.sh@20 -- # val= 00:07:07.130 20:54:22 -- accel/accel.sh@21 -- # case "$var" in 00:07:07.130 20:54:22 -- accel/accel.sh@19 -- # IFS=: 00:07:07.130 20:54:22 -- accel/accel.sh@19 -- # read -r var val 00:07:08.503 20:54:23 -- accel/accel.sh@20 -- # val= 00:07:08.503 20:54:23 -- accel/accel.sh@21 -- # case "$var" in 00:07:08.503 20:54:23 -- accel/accel.sh@19 -- # IFS=: 00:07:08.503 20:54:23 -- accel/accel.sh@19 -- # read -r var val 00:07:08.503 20:54:23 -- accel/accel.sh@20 -- # val= 00:07:08.503 20:54:23 -- accel/accel.sh@21 -- # case "$var" in 00:07:08.503 20:54:23 -- accel/accel.sh@19 -- # IFS=: 00:07:08.503 20:54:23 -- accel/accel.sh@19 -- # read -r var val 00:07:08.503 20:54:23 -- accel/accel.sh@20 -- # val= 00:07:08.503 20:54:23 -- accel/accel.sh@21 -- # case "$var" in 00:07:08.503 20:54:23 -- accel/accel.sh@19 -- # IFS=: 00:07:08.503 20:54:23 -- accel/accel.sh@19 -- # read -r var val 00:07:08.503 20:54:23 -- accel/accel.sh@20 -- # val= 00:07:08.503 20:54:23 -- accel/accel.sh@21 -- # case "$var" in 00:07:08.503 20:54:23 -- accel/accel.sh@19 -- # IFS=: 00:07:08.503 20:54:23 -- accel/accel.sh@19 -- # read -r var val 00:07:08.503 20:54:23 -- accel/accel.sh@20 -- # val= 00:07:08.503 20:54:23 -- accel/accel.sh@21 -- # case "$var" in 00:07:08.503 20:54:23 -- accel/accel.sh@19 -- # IFS=: 00:07:08.503 20:54:23 -- accel/accel.sh@19 -- # read -r var val 00:07:08.503 20:54:23 -- accel/accel.sh@20 -- # val= 00:07:08.503 20:54:23 -- accel/accel.sh@21 -- # case "$var" in 00:07:08.503 20:54:23 -- accel/accel.sh@19 -- # IFS=: 00:07:08.503 20:54:23 -- accel/accel.sh@19 -- # read -r var val 00:07:08.503 20:54:23 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:08.503 20:54:23 -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:07:08.503 20:54:23 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:08.503 00:07:08.503 real 0m1.292s 00:07:08.503 user 0m1.162s 00:07:08.503 sys 0m0.134s 00:07:08.503 20:54:23 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:08.503 20:54:23 -- common/autotest_common.sh@10 -- # set +x 00:07:08.503 ************************************ 00:07:08.503 END TEST accel_copy_crc32c_C2 00:07:08.503 ************************************ 00:07:08.503 20:54:23 -- accel/accel.sh@107 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:07:08.503 20:54:23 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:07:08.503 20:54:23 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:08.503 20:54:23 -- common/autotest_common.sh@10 -- # set +x 00:07:08.503 ************************************ 00:07:08.503 START TEST accel_dualcast 00:07:08.503 ************************************ 00:07:08.503 
20:54:23 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w dualcast -y 00:07:08.503 20:54:23 -- accel/accel.sh@16 -- # local accel_opc 00:07:08.503 20:54:23 -- accel/accel.sh@17 -- # local accel_module 00:07:08.503 20:54:23 -- accel/accel.sh@19 -- # IFS=: 00:07:08.503 20:54:23 -- accel/accel.sh@19 -- # read -r var val 00:07:08.503 20:54:23 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:07:08.503 20:54:23 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:07:08.503 20:54:23 -- accel/accel.sh@12 -- # build_accel_config 00:07:08.503 20:54:23 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:08.503 20:54:23 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:08.503 20:54:23 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:08.503 20:54:23 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:08.503 20:54:23 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:08.503 20:54:23 -- accel/accel.sh@40 -- # local IFS=, 00:07:08.503 20:54:23 -- accel/accel.sh@41 -- # jq -r . 00:07:08.503 [2024-04-25 20:54:23.991628] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:07:08.503 [2024-04-25 20:54:23.991718] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid187253 ] 00:07:08.503 EAL: No free 2048 kB hugepages reported on node 1 00:07:08.504 [2024-04-25 20:54:24.032485] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:08.504 [2024-04-25 20:54:24.064056] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:08.504 [2024-04-25 20:54:24.101822] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:08.504 20:54:24 -- accel/accel.sh@20 -- # val= 00:07:08.504 20:54:24 -- accel/accel.sh@21 -- # case "$var" in 00:07:08.504 20:54:24 -- accel/accel.sh@19 -- # IFS=: 00:07:08.504 20:54:24 -- accel/accel.sh@19 -- # read -r var val 00:07:08.504 20:54:24 -- accel/accel.sh@20 -- # val= 00:07:08.504 20:54:24 -- accel/accel.sh@21 -- # case "$var" in 00:07:08.504 20:54:24 -- accel/accel.sh@19 -- # IFS=: 00:07:08.504 20:54:24 -- accel/accel.sh@19 -- # read -r var val 00:07:08.504 20:54:24 -- accel/accel.sh@20 -- # val=0x1 00:07:08.504 20:54:24 -- accel/accel.sh@21 -- # case "$var" in 00:07:08.504 20:54:24 -- accel/accel.sh@19 -- # IFS=: 00:07:08.504 20:54:24 -- accel/accel.sh@19 -- # read -r var val 00:07:08.504 20:54:24 -- accel/accel.sh@20 -- # val= 00:07:08.504 20:54:24 -- accel/accel.sh@21 -- # case "$var" in 00:07:08.504 20:54:24 -- accel/accel.sh@19 -- # IFS=: 00:07:08.504 20:54:24 -- accel/accel.sh@19 -- # read -r var val 00:07:08.504 20:54:24 -- accel/accel.sh@20 -- # val= 00:07:08.504 20:54:24 -- accel/accel.sh@21 -- # case "$var" in 00:07:08.504 20:54:24 -- accel/accel.sh@19 -- # IFS=: 00:07:08.504 20:54:24 -- accel/accel.sh@19 -- # read -r var val 00:07:08.504 20:54:24 -- accel/accel.sh@20 -- # val=dualcast 00:07:08.504 20:54:24 -- accel/accel.sh@21 -- # case "$var" in 00:07:08.504 20:54:24 -- accel/accel.sh@23 -- # accel_opc=dualcast 00:07:08.504 20:54:24 -- accel/accel.sh@19 -- # IFS=: 00:07:08.504 20:54:24 -- accel/accel.sh@19 -- # read -r var val 00:07:08.504 20:54:24 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:08.504 20:54:24 -- accel/accel.sh@21 -- # case "$var" in 00:07:08.504 20:54:24 
-- accel/accel.sh@19 -- # IFS=: 00:07:08.504 20:54:24 -- accel/accel.sh@19 -- # read -r var val 00:07:08.504 20:54:24 -- accel/accel.sh@20 -- # val= 00:07:08.504 20:54:24 -- accel/accel.sh@21 -- # case "$var" in 00:07:08.504 20:54:24 -- accel/accel.sh@19 -- # IFS=: 00:07:08.504 20:54:24 -- accel/accel.sh@19 -- # read -r var val 00:07:08.504 20:54:24 -- accel/accel.sh@20 -- # val=software 00:07:08.504 20:54:24 -- accel/accel.sh@21 -- # case "$var" in 00:07:08.504 20:54:24 -- accel/accel.sh@22 -- # accel_module=software 00:07:08.504 20:54:24 -- accel/accel.sh@19 -- # IFS=: 00:07:08.504 20:54:24 -- accel/accel.sh@19 -- # read -r var val 00:07:08.504 20:54:24 -- accel/accel.sh@20 -- # val=32 00:07:08.504 20:54:24 -- accel/accel.sh@21 -- # case "$var" in 00:07:08.504 20:54:24 -- accel/accel.sh@19 -- # IFS=: 00:07:08.504 20:54:24 -- accel/accel.sh@19 -- # read -r var val 00:07:08.504 20:54:24 -- accel/accel.sh@20 -- # val=32 00:07:08.504 20:54:24 -- accel/accel.sh@21 -- # case "$var" in 00:07:08.504 20:54:24 -- accel/accel.sh@19 -- # IFS=: 00:07:08.504 20:54:24 -- accel/accel.sh@19 -- # read -r var val 00:07:08.504 20:54:24 -- accel/accel.sh@20 -- # val=1 00:07:08.504 20:54:24 -- accel/accel.sh@21 -- # case "$var" in 00:07:08.504 20:54:24 -- accel/accel.sh@19 -- # IFS=: 00:07:08.504 20:54:24 -- accel/accel.sh@19 -- # read -r var val 00:07:08.504 20:54:24 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:08.504 20:54:24 -- accel/accel.sh@21 -- # case "$var" in 00:07:08.504 20:54:24 -- accel/accel.sh@19 -- # IFS=: 00:07:08.504 20:54:24 -- accel/accel.sh@19 -- # read -r var val 00:07:08.504 20:54:24 -- accel/accel.sh@20 -- # val=Yes 00:07:08.504 20:54:24 -- accel/accel.sh@21 -- # case "$var" in 00:07:08.504 20:54:24 -- accel/accel.sh@19 -- # IFS=: 00:07:08.504 20:54:24 -- accel/accel.sh@19 -- # read -r var val 00:07:08.504 20:54:24 -- accel/accel.sh@20 -- # val= 00:07:08.504 20:54:24 -- accel/accel.sh@21 -- # case "$var" in 00:07:08.504 20:54:24 -- accel/accel.sh@19 -- # IFS=: 00:07:08.504 20:54:24 -- accel/accel.sh@19 -- # read -r var val 00:07:08.504 20:54:24 -- accel/accel.sh@20 -- # val= 00:07:08.504 20:54:24 -- accel/accel.sh@21 -- # case "$var" in 00:07:08.504 20:54:24 -- accel/accel.sh@19 -- # IFS=: 00:07:08.504 20:54:24 -- accel/accel.sh@19 -- # read -r var val 00:07:09.879 20:54:25 -- accel/accel.sh@20 -- # val= 00:07:09.879 20:54:25 -- accel/accel.sh@21 -- # case "$var" in 00:07:09.879 20:54:25 -- accel/accel.sh@19 -- # IFS=: 00:07:09.879 20:54:25 -- accel/accel.sh@19 -- # read -r var val 00:07:09.879 20:54:25 -- accel/accel.sh@20 -- # val= 00:07:09.879 20:54:25 -- accel/accel.sh@21 -- # case "$var" in 00:07:09.879 20:54:25 -- accel/accel.sh@19 -- # IFS=: 00:07:09.879 20:54:25 -- accel/accel.sh@19 -- # read -r var val 00:07:09.879 20:54:25 -- accel/accel.sh@20 -- # val= 00:07:09.879 20:54:25 -- accel/accel.sh@21 -- # case "$var" in 00:07:09.879 20:54:25 -- accel/accel.sh@19 -- # IFS=: 00:07:09.879 20:54:25 -- accel/accel.sh@19 -- # read -r var val 00:07:09.879 20:54:25 -- accel/accel.sh@20 -- # val= 00:07:09.879 20:54:25 -- accel/accel.sh@21 -- # case "$var" in 00:07:09.879 20:54:25 -- accel/accel.sh@19 -- # IFS=: 00:07:09.879 20:54:25 -- accel/accel.sh@19 -- # read -r var val 00:07:09.879 20:54:25 -- accel/accel.sh@20 -- # val= 00:07:09.879 20:54:25 -- accel/accel.sh@21 -- # case "$var" in 00:07:09.879 20:54:25 -- accel/accel.sh@19 -- # IFS=: 00:07:09.879 20:54:25 -- accel/accel.sh@19 -- # read -r var val 00:07:09.879 20:54:25 -- accel/accel.sh@20 -- # val= 00:07:09.879 20:54:25 -- 
accel/accel.sh@21 -- # case "$var" in 00:07:09.879 20:54:25 -- accel/accel.sh@19 -- # IFS=: 00:07:09.879 20:54:25 -- accel/accel.sh@19 -- # read -r var val 00:07:09.879 20:54:25 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:09.879 20:54:25 -- accel/accel.sh@27 -- # [[ -n dualcast ]] 00:07:09.879 20:54:25 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:09.879 00:07:09.879 real 0m1.293s 00:07:09.879 user 0m1.154s 00:07:09.879 sys 0m0.143s 00:07:09.879 20:54:25 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:09.879 20:54:25 -- common/autotest_common.sh@10 -- # set +x 00:07:09.879 ************************************ 00:07:09.879 END TEST accel_dualcast 00:07:09.879 ************************************ 00:07:09.879 20:54:25 -- accel/accel.sh@108 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:07:09.879 20:54:25 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:07:09.879 20:54:25 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:09.879 20:54:25 -- common/autotest_common.sh@10 -- # set +x 00:07:09.879 ************************************ 00:07:09.879 START TEST accel_compare 00:07:09.879 ************************************ 00:07:09.879 20:54:25 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w compare -y 00:07:09.879 20:54:25 -- accel/accel.sh@16 -- # local accel_opc 00:07:09.879 20:54:25 -- accel/accel.sh@17 -- # local accel_module 00:07:09.879 20:54:25 -- accel/accel.sh@19 -- # IFS=: 00:07:09.879 20:54:25 -- accel/accel.sh@19 -- # read -r var val 00:07:09.879 20:54:25 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:07:09.879 20:54:25 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:07:09.879 20:54:25 -- accel/accel.sh@12 -- # build_accel_config 00:07:09.879 20:54:25 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:09.879 20:54:25 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:09.879 20:54:25 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:09.879 20:54:25 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:09.879 20:54:25 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:09.879 20:54:25 -- accel/accel.sh@40 -- # local IFS=, 00:07:09.879 20:54:25 -- accel/accel.sh@41 -- # jq -r . 00:07:09.879 [2024-04-25 20:54:25.481271] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:07:09.879 [2024-04-25 20:54:25.481351] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid187548 ] 00:07:09.879 EAL: No free 2048 kB hugepages reported on node 1 00:07:09.879 [2024-04-25 20:54:25.518843] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 
00:07:10.180 [2024-04-25 20:54:25.550780] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:10.180 [2024-04-25 20:54:25.586997] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:10.180 20:54:25 -- accel/accel.sh@20 -- # val= 00:07:10.180 20:54:25 -- accel/accel.sh@21 -- # case "$var" in 00:07:10.180 20:54:25 -- accel/accel.sh@19 -- # IFS=: 00:07:10.180 20:54:25 -- accel/accel.sh@19 -- # read -r var val 00:07:10.180 20:54:25 -- accel/accel.sh@20 -- # val= 00:07:10.180 20:54:25 -- accel/accel.sh@21 -- # case "$var" in 00:07:10.180 20:54:25 -- accel/accel.sh@19 -- # IFS=: 00:07:10.180 20:54:25 -- accel/accel.sh@19 -- # read -r var val 00:07:10.180 20:54:25 -- accel/accel.sh@20 -- # val=0x1 00:07:10.180 20:54:25 -- accel/accel.sh@21 -- # case "$var" in 00:07:10.180 20:54:25 -- accel/accel.sh@19 -- # IFS=: 00:07:10.180 20:54:25 -- accel/accel.sh@19 -- # read -r var val 00:07:10.180 20:54:25 -- accel/accel.sh@20 -- # val= 00:07:10.180 20:54:25 -- accel/accel.sh@21 -- # case "$var" in 00:07:10.180 20:54:25 -- accel/accel.sh@19 -- # IFS=: 00:07:10.180 20:54:25 -- accel/accel.sh@19 -- # read -r var val 00:07:10.180 20:54:25 -- accel/accel.sh@20 -- # val= 00:07:10.180 20:54:25 -- accel/accel.sh@21 -- # case "$var" in 00:07:10.180 20:54:25 -- accel/accel.sh@19 -- # IFS=: 00:07:10.180 20:54:25 -- accel/accel.sh@19 -- # read -r var val 00:07:10.180 20:54:25 -- accel/accel.sh@20 -- # val=compare 00:07:10.180 20:54:25 -- accel/accel.sh@21 -- # case "$var" in 00:07:10.180 20:54:25 -- accel/accel.sh@23 -- # accel_opc=compare 00:07:10.180 20:54:25 -- accel/accel.sh@19 -- # IFS=: 00:07:10.180 20:54:25 -- accel/accel.sh@19 -- # read -r var val 00:07:10.180 20:54:25 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:10.180 20:54:25 -- accel/accel.sh@21 -- # case "$var" in 00:07:10.180 20:54:25 -- accel/accel.sh@19 -- # IFS=: 00:07:10.180 20:54:25 -- accel/accel.sh@19 -- # read -r var val 00:07:10.180 20:54:25 -- accel/accel.sh@20 -- # val= 00:07:10.180 20:54:25 -- accel/accel.sh@21 -- # case "$var" in 00:07:10.180 20:54:25 -- accel/accel.sh@19 -- # IFS=: 00:07:10.180 20:54:25 -- accel/accel.sh@19 -- # read -r var val 00:07:10.180 20:54:25 -- accel/accel.sh@20 -- # val=software 00:07:10.180 20:54:25 -- accel/accel.sh@21 -- # case "$var" in 00:07:10.180 20:54:25 -- accel/accel.sh@22 -- # accel_module=software 00:07:10.180 20:54:25 -- accel/accel.sh@19 -- # IFS=: 00:07:10.180 20:54:25 -- accel/accel.sh@19 -- # read -r var val 00:07:10.180 20:54:25 -- accel/accel.sh@20 -- # val=32 00:07:10.180 20:54:25 -- accel/accel.sh@21 -- # case "$var" in 00:07:10.180 20:54:25 -- accel/accel.sh@19 -- # IFS=: 00:07:10.180 20:54:25 -- accel/accel.sh@19 -- # read -r var val 00:07:10.180 20:54:25 -- accel/accel.sh@20 -- # val=32 00:07:10.180 20:54:25 -- accel/accel.sh@21 -- # case "$var" in 00:07:10.180 20:54:25 -- accel/accel.sh@19 -- # IFS=: 00:07:10.180 20:54:25 -- accel/accel.sh@19 -- # read -r var val 00:07:10.180 20:54:25 -- accel/accel.sh@20 -- # val=1 00:07:10.180 20:54:25 -- accel/accel.sh@21 -- # case "$var" in 00:07:10.180 20:54:25 -- accel/accel.sh@19 -- # IFS=: 00:07:10.180 20:54:25 -- accel/accel.sh@19 -- # read -r var val 00:07:10.180 20:54:25 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:10.180 20:54:25 -- accel/accel.sh@21 -- # case "$var" in 00:07:10.180 20:54:25 -- accel/accel.sh@19 -- # IFS=: 00:07:10.180 20:54:25 -- accel/accel.sh@19 -- # read -r var val 00:07:10.180 20:54:25 -- accel/accel.sh@20 -- # val=Yes 00:07:10.180 20:54:25 -- accel/accel.sh@21 -- # case "$var" in 
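The repeated "IFS=: / read -r var val / case \"$var\" in" entries tracing through this compare test come from a loop in accel.sh that reads accel_perf's configuration dump line by line and records the opcode and module under test. A minimal sketch of that pattern — not the actual accel.sh source, and the "opcode"/"module" field names are assumed for illustration:

  accel_opc='' accel_module=''
  while IFS=: read -r var val; do        # split each output line at the first ':'
      case "$var" in
          opcode) accel_opc=$val ;;      # e.g. "compare"
          module) accel_module=$val ;;   # e.g. "software"
      esac
  done < <(./build/examples/accel_perf -t 1 -w compare -y 2>&1)
  [[ -n $accel_opc && -n $accel_module ]] || exit 1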
00:07:10.180 20:54:25 -- accel/accel.sh@19 -- # IFS=: 00:07:10.180 20:54:25 -- accel/accel.sh@19 -- # read -r var val 00:07:10.180 20:54:25 -- accel/accel.sh@20 -- # val= 00:07:10.180 20:54:25 -- accel/accel.sh@21 -- # case "$var" in 00:07:10.180 20:54:25 -- accel/accel.sh@19 -- # IFS=: 00:07:10.180 20:54:25 -- accel/accel.sh@19 -- # read -r var val 00:07:10.180 20:54:25 -- accel/accel.sh@20 -- # val= 00:07:10.180 20:54:25 -- accel/accel.sh@21 -- # case "$var" in 00:07:10.180 20:54:25 -- accel/accel.sh@19 -- # IFS=: 00:07:10.180 20:54:25 -- accel/accel.sh@19 -- # read -r var val 00:07:11.162 20:54:26 -- accel/accel.sh@20 -- # val= 00:07:11.162 20:54:26 -- accel/accel.sh@21 -- # case "$var" in 00:07:11.162 20:54:26 -- accel/accel.sh@19 -- # IFS=: 00:07:11.162 20:54:26 -- accel/accel.sh@19 -- # read -r var val 00:07:11.162 20:54:26 -- accel/accel.sh@20 -- # val= 00:07:11.162 20:54:26 -- accel/accel.sh@21 -- # case "$var" in 00:07:11.162 20:54:26 -- accel/accel.sh@19 -- # IFS=: 00:07:11.162 20:54:26 -- accel/accel.sh@19 -- # read -r var val 00:07:11.162 20:54:26 -- accel/accel.sh@20 -- # val= 00:07:11.162 20:54:26 -- accel/accel.sh@21 -- # case "$var" in 00:07:11.162 20:54:26 -- accel/accel.sh@19 -- # IFS=: 00:07:11.162 20:54:26 -- accel/accel.sh@19 -- # read -r var val 00:07:11.162 20:54:26 -- accel/accel.sh@20 -- # val= 00:07:11.162 20:54:26 -- accel/accel.sh@21 -- # case "$var" in 00:07:11.162 20:54:26 -- accel/accel.sh@19 -- # IFS=: 00:07:11.162 20:54:26 -- accel/accel.sh@19 -- # read -r var val 00:07:11.162 20:54:26 -- accel/accel.sh@20 -- # val= 00:07:11.162 20:54:26 -- accel/accel.sh@21 -- # case "$var" in 00:07:11.162 20:54:26 -- accel/accel.sh@19 -- # IFS=: 00:07:11.162 20:54:26 -- accel/accel.sh@19 -- # read -r var val 00:07:11.162 20:54:26 -- accel/accel.sh@20 -- # val= 00:07:11.162 20:54:26 -- accel/accel.sh@21 -- # case "$var" in 00:07:11.162 20:54:26 -- accel/accel.sh@19 -- # IFS=: 00:07:11.162 20:54:26 -- accel/accel.sh@19 -- # read -r var val 00:07:11.162 20:54:26 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:11.162 20:54:26 -- accel/accel.sh@27 -- # [[ -n compare ]] 00:07:11.162 20:54:26 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:11.162 00:07:11.162 real 0m1.287s 00:07:11.162 user 0m1.163s 00:07:11.162 sys 0m0.128s 00:07:11.162 20:54:26 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:11.162 20:54:26 -- common/autotest_common.sh@10 -- # set +x 00:07:11.162 ************************************ 00:07:11.162 END TEST accel_compare 00:07:11.162 ************************************ 00:07:11.162 20:54:26 -- accel/accel.sh@109 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:07:11.162 20:54:26 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:07:11.162 20:54:26 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:11.162 20:54:26 -- common/autotest_common.sh@10 -- # set +x 00:07:11.421 ************************************ 00:07:11.421 START TEST accel_xor 00:07:11.421 ************************************ 00:07:11.422 20:54:26 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w xor -y 00:07:11.422 20:54:26 -- accel/accel.sh@16 -- # local accel_opc 00:07:11.422 20:54:26 -- accel/accel.sh@17 -- # local accel_module 00:07:11.422 20:54:26 -- accel/accel.sh@19 -- # IFS=: 00:07:11.422 20:54:26 -- accel/accel.sh@19 -- # read -r var val 00:07:11.422 20:54:26 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:07:11.422 20:54:26 -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:07:11.422 20:54:26 -- accel/accel.sh@12 -- # build_accel_config 00:07:11.422 20:54:26 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:11.422 20:54:26 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:11.422 20:54:26 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:11.422 20:54:26 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:11.422 20:54:26 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:11.422 20:54:26 -- accel/accel.sh@40 -- # local IFS=, 00:07:11.422 20:54:26 -- accel/accel.sh@41 -- # jq -r . 00:07:11.422 [2024-04-25 20:54:26.967229] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:07:11.422 [2024-04-25 20:54:26.967308] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid187844 ] 00:07:11.422 EAL: No free 2048 kB hugepages reported on node 1 00:07:11.422 [2024-04-25 20:54:27.005547] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:11.422 [2024-04-25 20:54:27.036293] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:11.422 [2024-04-25 20:54:27.072686] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:11.681 20:54:27 -- accel/accel.sh@20 -- # val= 00:07:11.681 20:54:27 -- accel/accel.sh@21 -- # case "$var" in 00:07:11.681 20:54:27 -- accel/accel.sh@19 -- # IFS=: 00:07:11.681 20:54:27 -- accel/accel.sh@19 -- # read -r var val 00:07:11.681 20:54:27 -- accel/accel.sh@20 -- # val= 00:07:11.681 20:54:27 -- accel/accel.sh@21 -- # case "$var" in 00:07:11.681 20:54:27 -- accel/accel.sh@19 -- # IFS=: 00:07:11.681 20:54:27 -- accel/accel.sh@19 -- # read -r var val 00:07:11.681 20:54:27 -- accel/accel.sh@20 -- # val=0x1 00:07:11.681 20:54:27 -- accel/accel.sh@21 -- # case "$var" in 00:07:11.681 20:54:27 -- accel/accel.sh@19 -- # IFS=: 00:07:11.681 20:54:27 -- accel/accel.sh@19 -- # read -r var val 00:07:11.681 20:54:27 -- accel/accel.sh@20 -- # val= 00:07:11.681 20:54:27 -- accel/accel.sh@21 -- # case "$var" in 00:07:11.681 20:54:27 -- accel/accel.sh@19 -- # IFS=: 00:07:11.681 20:54:27 -- accel/accel.sh@19 -- # read -r var val 00:07:11.681 20:54:27 -- accel/accel.sh@20 -- # val= 00:07:11.681 20:54:27 -- accel/accel.sh@21 -- # case "$var" in 00:07:11.681 20:54:27 -- accel/accel.sh@19 -- # IFS=: 00:07:11.681 20:54:27 -- accel/accel.sh@19 -- # read -r var val 00:07:11.681 20:54:27 -- accel/accel.sh@20 -- # val=xor 00:07:11.681 20:54:27 -- accel/accel.sh@21 -- # case "$var" in 00:07:11.681 20:54:27 -- accel/accel.sh@23 -- # accel_opc=xor 00:07:11.681 20:54:27 -- accel/accel.sh@19 -- # IFS=: 00:07:11.681 20:54:27 -- accel/accel.sh@19 -- # read -r var val 00:07:11.681 20:54:27 -- accel/accel.sh@20 -- # val=2 00:07:11.681 20:54:27 -- accel/accel.sh@21 -- # case "$var" in 00:07:11.681 20:54:27 -- accel/accel.sh@19 -- # IFS=: 00:07:11.681 20:54:27 -- accel/accel.sh@19 -- # read -r var val 00:07:11.681 20:54:27 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:11.681 20:54:27 -- accel/accel.sh@21 -- # case "$var" in 00:07:11.681 20:54:27 -- accel/accel.sh@19 -- # IFS=: 00:07:11.681 20:54:27 -- accel/accel.sh@19 -- # read -r var val 00:07:11.681 20:54:27 -- accel/accel.sh@20 -- # val= 00:07:11.681 20:54:27 -- accel/accel.sh@21 -- # case "$var" in 00:07:11.681 20:54:27 
-- accel/accel.sh@19 -- # IFS=: 00:07:11.681 20:54:27 -- accel/accel.sh@19 -- # read -r var val 00:07:11.681 20:54:27 -- accel/accel.sh@20 -- # val=software 00:07:11.681 20:54:27 -- accel/accel.sh@21 -- # case "$var" in 00:07:11.681 20:54:27 -- accel/accel.sh@22 -- # accel_module=software 00:07:11.681 20:54:27 -- accel/accel.sh@19 -- # IFS=: 00:07:11.681 20:54:27 -- accel/accel.sh@19 -- # read -r var val 00:07:11.681 20:54:27 -- accel/accel.sh@20 -- # val=32 00:07:11.681 20:54:27 -- accel/accel.sh@21 -- # case "$var" in 00:07:11.681 20:54:27 -- accel/accel.sh@19 -- # IFS=: 00:07:11.681 20:54:27 -- accel/accel.sh@19 -- # read -r var val 00:07:11.681 20:54:27 -- accel/accel.sh@20 -- # val=32 00:07:11.681 20:54:27 -- accel/accel.sh@21 -- # case "$var" in 00:07:11.681 20:54:27 -- accel/accel.sh@19 -- # IFS=: 00:07:11.681 20:54:27 -- accel/accel.sh@19 -- # read -r var val 00:07:11.681 20:54:27 -- accel/accel.sh@20 -- # val=1 00:07:11.681 20:54:27 -- accel/accel.sh@21 -- # case "$var" in 00:07:11.681 20:54:27 -- accel/accel.sh@19 -- # IFS=: 00:07:11.681 20:54:27 -- accel/accel.sh@19 -- # read -r var val 00:07:11.681 20:54:27 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:11.681 20:54:27 -- accel/accel.sh@21 -- # case "$var" in 00:07:11.681 20:54:27 -- accel/accel.sh@19 -- # IFS=: 00:07:11.681 20:54:27 -- accel/accel.sh@19 -- # read -r var val 00:07:11.681 20:54:27 -- accel/accel.sh@20 -- # val=Yes 00:07:11.681 20:54:27 -- accel/accel.sh@21 -- # case "$var" in 00:07:11.681 20:54:27 -- accel/accel.sh@19 -- # IFS=: 00:07:11.681 20:54:27 -- accel/accel.sh@19 -- # read -r var val 00:07:11.681 20:54:27 -- accel/accel.sh@20 -- # val= 00:07:11.681 20:54:27 -- accel/accel.sh@21 -- # case "$var" in 00:07:11.681 20:54:27 -- accel/accel.sh@19 -- # IFS=: 00:07:11.681 20:54:27 -- accel/accel.sh@19 -- # read -r var val 00:07:11.681 20:54:27 -- accel/accel.sh@20 -- # val= 00:07:11.681 20:54:27 -- accel/accel.sh@21 -- # case "$var" in 00:07:11.681 20:54:27 -- accel/accel.sh@19 -- # IFS=: 00:07:11.681 20:54:27 -- accel/accel.sh@19 -- # read -r var val 00:07:12.618 20:54:28 -- accel/accel.sh@20 -- # val= 00:07:12.618 20:54:28 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.618 20:54:28 -- accel/accel.sh@19 -- # IFS=: 00:07:12.618 20:54:28 -- accel/accel.sh@19 -- # read -r var val 00:07:12.618 20:54:28 -- accel/accel.sh@20 -- # val= 00:07:12.618 20:54:28 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.618 20:54:28 -- accel/accel.sh@19 -- # IFS=: 00:07:12.618 20:54:28 -- accel/accel.sh@19 -- # read -r var val 00:07:12.618 20:54:28 -- accel/accel.sh@20 -- # val= 00:07:12.618 20:54:28 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.618 20:54:28 -- accel/accel.sh@19 -- # IFS=: 00:07:12.618 20:54:28 -- accel/accel.sh@19 -- # read -r var val 00:07:12.618 20:54:28 -- accel/accel.sh@20 -- # val= 00:07:12.618 20:54:28 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.618 20:54:28 -- accel/accel.sh@19 -- # IFS=: 00:07:12.618 20:54:28 -- accel/accel.sh@19 -- # read -r var val 00:07:12.618 20:54:28 -- accel/accel.sh@20 -- # val= 00:07:12.618 20:54:28 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.618 20:54:28 -- accel/accel.sh@19 -- # IFS=: 00:07:12.618 20:54:28 -- accel/accel.sh@19 -- # read -r var val 00:07:12.618 20:54:28 -- accel/accel.sh@20 -- # val= 00:07:12.618 20:54:28 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.618 20:54:28 -- accel/accel.sh@19 -- # IFS=: 00:07:12.618 20:54:28 -- accel/accel.sh@19 -- # read -r var val 00:07:12.618 20:54:28 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:12.618 
20:54:28 -- accel/accel.sh@27 -- # [[ -n xor ]] 00:07:12.618 20:54:28 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:12.618 00:07:12.618 real 0m1.288s 00:07:12.618 user 0m1.162s 00:07:12.618 sys 0m0.130s 00:07:12.618 20:54:28 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:12.618 20:54:28 -- common/autotest_common.sh@10 -- # set +x 00:07:12.618 ************************************ 00:07:12.618 END TEST accel_xor 00:07:12.618 ************************************ 00:07:12.618 20:54:28 -- accel/accel.sh@110 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:07:12.618 20:54:28 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:07:12.618 20:54:28 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:12.618 20:54:28 -- common/autotest_common.sh@10 -- # set +x 00:07:12.877 ************************************ 00:07:12.877 START TEST accel_xor 00:07:12.877 ************************************ 00:07:12.877 20:54:28 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w xor -y -x 3 00:07:12.877 20:54:28 -- accel/accel.sh@16 -- # local accel_opc 00:07:12.877 20:54:28 -- accel/accel.sh@17 -- # local accel_module 00:07:12.877 20:54:28 -- accel/accel.sh@19 -- # IFS=: 00:07:12.877 20:54:28 -- accel/accel.sh@19 -- # read -r var val 00:07:12.877 20:54:28 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:07:12.877 20:54:28 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:07:12.877 20:54:28 -- accel/accel.sh@12 -- # build_accel_config 00:07:12.877 20:54:28 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:12.877 20:54:28 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:12.877 20:54:28 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:12.877 20:54:28 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:12.877 20:54:28 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:12.877 20:54:28 -- accel/accel.sh@40 -- # local IFS=, 00:07:12.877 20:54:28 -- accel/accel.sh@41 -- # jq -r . 00:07:12.877 [2024-04-25 20:54:28.451718] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:07:12.877 [2024-04-25 20:54:28.451800] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid188118 ] 00:07:12.877 EAL: No free 2048 kB hugepages reported on node 1 00:07:12.877 [2024-04-25 20:54:28.493712] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 
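This second xor test adds -x 3 to the command line; matching the val=2 recorded for the first xor run and the val=3 traced below, the flag appears to set the number of xor source buffers. Hand-run sketch, same assumed build-tree path:

  # default two-source xor, then the three-source variant exercised here
  ./build/examples/accel_perf -t 1 -w xor -y
  ./build/examples/accel_perf -t 1 -w xor -y -x 3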
00:07:12.877 [2024-04-25 20:54:28.524138] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:13.137 [2024-04-25 20:54:28.561623] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:13.137 20:54:28 -- accel/accel.sh@20 -- # val= 00:07:13.137 20:54:28 -- accel/accel.sh@21 -- # case "$var" in 00:07:13.137 20:54:28 -- accel/accel.sh@19 -- # IFS=: 00:07:13.137 20:54:28 -- accel/accel.sh@19 -- # read -r var val 00:07:13.137 20:54:28 -- accel/accel.sh@20 -- # val= 00:07:13.137 20:54:28 -- accel/accel.sh@21 -- # case "$var" in 00:07:13.137 20:54:28 -- accel/accel.sh@19 -- # IFS=: 00:07:13.137 20:54:28 -- accel/accel.sh@19 -- # read -r var val 00:07:13.137 20:54:28 -- accel/accel.sh@20 -- # val=0x1 00:07:13.137 20:54:28 -- accel/accel.sh@21 -- # case "$var" in 00:07:13.137 20:54:28 -- accel/accel.sh@19 -- # IFS=: 00:07:13.137 20:54:28 -- accel/accel.sh@19 -- # read -r var val 00:07:13.137 20:54:28 -- accel/accel.sh@20 -- # val= 00:07:13.137 20:54:28 -- accel/accel.sh@21 -- # case "$var" in 00:07:13.137 20:54:28 -- accel/accel.sh@19 -- # IFS=: 00:07:13.137 20:54:28 -- accel/accel.sh@19 -- # read -r var val 00:07:13.137 20:54:28 -- accel/accel.sh@20 -- # val= 00:07:13.137 20:54:28 -- accel/accel.sh@21 -- # case "$var" in 00:07:13.137 20:54:28 -- accel/accel.sh@19 -- # IFS=: 00:07:13.137 20:54:28 -- accel/accel.sh@19 -- # read -r var val 00:07:13.137 20:54:28 -- accel/accel.sh@20 -- # val=xor 00:07:13.137 20:54:28 -- accel/accel.sh@21 -- # case "$var" in 00:07:13.137 20:54:28 -- accel/accel.sh@23 -- # accel_opc=xor 00:07:13.137 20:54:28 -- accel/accel.sh@19 -- # IFS=: 00:07:13.137 20:54:28 -- accel/accel.sh@19 -- # read -r var val 00:07:13.137 20:54:28 -- accel/accel.sh@20 -- # val=3 00:07:13.137 20:54:28 -- accel/accel.sh@21 -- # case "$var" in 00:07:13.137 20:54:28 -- accel/accel.sh@19 -- # IFS=: 00:07:13.137 20:54:28 -- accel/accel.sh@19 -- # read -r var val 00:07:13.137 20:54:28 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:13.137 20:54:28 -- accel/accel.sh@21 -- # case "$var" in 00:07:13.137 20:54:28 -- accel/accel.sh@19 -- # IFS=: 00:07:13.137 20:54:28 -- accel/accel.sh@19 -- # read -r var val 00:07:13.137 20:54:28 -- accel/accel.sh@20 -- # val= 00:07:13.137 20:54:28 -- accel/accel.sh@21 -- # case "$var" in 00:07:13.137 20:54:28 -- accel/accel.sh@19 -- # IFS=: 00:07:13.137 20:54:28 -- accel/accel.sh@19 -- # read -r var val 00:07:13.137 20:54:28 -- accel/accel.sh@20 -- # val=software 00:07:13.137 20:54:28 -- accel/accel.sh@21 -- # case "$var" in 00:07:13.137 20:54:28 -- accel/accel.sh@22 -- # accel_module=software 00:07:13.137 20:54:28 -- accel/accel.sh@19 -- # IFS=: 00:07:13.137 20:54:28 -- accel/accel.sh@19 -- # read -r var val 00:07:13.137 20:54:28 -- accel/accel.sh@20 -- # val=32 00:07:13.137 20:54:28 -- accel/accel.sh@21 -- # case "$var" in 00:07:13.137 20:54:28 -- accel/accel.sh@19 -- # IFS=: 00:07:13.137 20:54:28 -- accel/accel.sh@19 -- # read -r var val 00:07:13.137 20:54:28 -- accel/accel.sh@20 -- # val=32 00:07:13.137 20:54:28 -- accel/accel.sh@21 -- # case "$var" in 00:07:13.137 20:54:28 -- accel/accel.sh@19 -- # IFS=: 00:07:13.137 20:54:28 -- accel/accel.sh@19 -- # read -r var val 00:07:13.137 20:54:28 -- accel/accel.sh@20 -- # val=1 00:07:13.137 20:54:28 -- accel/accel.sh@21 -- # case "$var" in 00:07:13.137 20:54:28 -- accel/accel.sh@19 -- # IFS=: 00:07:13.137 20:54:28 -- accel/accel.sh@19 -- # read -r var val 00:07:13.137 20:54:28 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:13.137 20:54:28 -- accel/accel.sh@21 -- # case "$var" in 
00:07:13.137 20:54:28 -- accel/accel.sh@19 -- # IFS=: 00:07:13.137 20:54:28 -- accel/accel.sh@19 -- # read -r var val 00:07:13.137 20:54:28 -- accel/accel.sh@20 -- # val=Yes 00:07:13.137 20:54:28 -- accel/accel.sh@21 -- # case "$var" in 00:07:13.137 20:54:28 -- accel/accel.sh@19 -- # IFS=: 00:07:13.137 20:54:28 -- accel/accel.sh@19 -- # read -r var val 00:07:13.137 20:54:28 -- accel/accel.sh@20 -- # val= 00:07:13.137 20:54:28 -- accel/accel.sh@21 -- # case "$var" in 00:07:13.137 20:54:28 -- accel/accel.sh@19 -- # IFS=: 00:07:13.137 20:54:28 -- accel/accel.sh@19 -- # read -r var val 00:07:13.137 20:54:28 -- accel/accel.sh@20 -- # val= 00:07:13.137 20:54:28 -- accel/accel.sh@21 -- # case "$var" in 00:07:13.137 20:54:28 -- accel/accel.sh@19 -- # IFS=: 00:07:13.137 20:54:28 -- accel/accel.sh@19 -- # read -r var val 00:07:14.075 20:54:29 -- accel/accel.sh@20 -- # val= 00:07:14.075 20:54:29 -- accel/accel.sh@21 -- # case "$var" in 00:07:14.075 20:54:29 -- accel/accel.sh@19 -- # IFS=: 00:07:14.075 20:54:29 -- accel/accel.sh@19 -- # read -r var val 00:07:14.075 20:54:29 -- accel/accel.sh@20 -- # val= 00:07:14.075 20:54:29 -- accel/accel.sh@21 -- # case "$var" in 00:07:14.075 20:54:29 -- accel/accel.sh@19 -- # IFS=: 00:07:14.075 20:54:29 -- accel/accel.sh@19 -- # read -r var val 00:07:14.075 20:54:29 -- accel/accel.sh@20 -- # val= 00:07:14.075 20:54:29 -- accel/accel.sh@21 -- # case "$var" in 00:07:14.075 20:54:29 -- accel/accel.sh@19 -- # IFS=: 00:07:14.075 20:54:29 -- accel/accel.sh@19 -- # read -r var val 00:07:14.075 20:54:29 -- accel/accel.sh@20 -- # val= 00:07:14.075 20:54:29 -- accel/accel.sh@21 -- # case "$var" in 00:07:14.075 20:54:29 -- accel/accel.sh@19 -- # IFS=: 00:07:14.075 20:54:29 -- accel/accel.sh@19 -- # read -r var val 00:07:14.075 20:54:29 -- accel/accel.sh@20 -- # val= 00:07:14.075 20:54:29 -- accel/accel.sh@21 -- # case "$var" in 00:07:14.075 20:54:29 -- accel/accel.sh@19 -- # IFS=: 00:07:14.075 20:54:29 -- accel/accel.sh@19 -- # read -r var val 00:07:14.075 20:54:29 -- accel/accel.sh@20 -- # val= 00:07:14.075 20:54:29 -- accel/accel.sh@21 -- # case "$var" in 00:07:14.075 20:54:29 -- accel/accel.sh@19 -- # IFS=: 00:07:14.075 20:54:29 -- accel/accel.sh@19 -- # read -r var val 00:07:14.075 20:54:29 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:14.075 20:54:29 -- accel/accel.sh@27 -- # [[ -n xor ]] 00:07:14.075 20:54:29 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:14.075 00:07:14.075 real 0m1.292s 00:07:14.075 user 0m1.163s 00:07:14.075 sys 0m0.132s 00:07:14.075 20:54:29 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:14.075 20:54:29 -- common/autotest_common.sh@10 -- # set +x 00:07:14.075 ************************************ 00:07:14.075 END TEST accel_xor 00:07:14.075 ************************************ 00:07:14.334 20:54:29 -- accel/accel.sh@111 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:07:14.334 20:54:29 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:07:14.334 20:54:29 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:14.334 20:54:29 -- common/autotest_common.sh@10 -- # set +x 00:07:14.334 ************************************ 00:07:14.334 START TEST accel_dif_verify 00:07:14.334 ************************************ 00:07:14.334 20:54:29 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w dif_verify 00:07:14.334 20:54:29 -- accel/accel.sh@16 -- # local accel_opc 00:07:14.334 20:54:29 -- accel/accel.sh@17 -- # local accel_module 00:07:14.334 20:54:29 -- accel/accel.sh@19 -- # IFS=: 
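The dif_verify run beginning above works on 4096-byte buffers with a 512-byte block size and 8 bytes of protection information per block (the '4096 bytes'/'512 bytes'/'8 bytes' values traced below) — a layout consistent with T10 DIF, where each 512-byte block carries a 2-byte guard CRC, a 2-byte application tag and a 4-byte reference tag. Hand-run sketch, command line as recorded in the trace:

  # verify DIF protection information over software-generated buffers
  ./build/examples/accel_perf -t 1 -w dif_verify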
00:07:14.334 20:54:29 -- accel/accel.sh@19 -- # read -r var val 00:07:14.334 20:54:29 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:07:14.335 20:54:29 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:07:14.335 20:54:29 -- accel/accel.sh@12 -- # build_accel_config 00:07:14.335 20:54:29 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:14.335 20:54:29 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:14.335 20:54:29 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:14.335 20:54:29 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:14.335 20:54:29 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:14.335 20:54:29 -- accel/accel.sh@40 -- # local IFS=, 00:07:14.335 20:54:29 -- accel/accel.sh@41 -- # jq -r . 00:07:14.335 [2024-04-25 20:54:29.938577] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:07:14.335 [2024-04-25 20:54:29.938666] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid188388 ] 00:07:14.335 EAL: No free 2048 kB hugepages reported on node 1 00:07:14.335 [2024-04-25 20:54:29.977423] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:14.593 [2024-04-25 20:54:30.009862] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:14.593 [2024-04-25 20:54:30.048999] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:14.593 20:54:30 -- accel/accel.sh@20 -- # val= 00:07:14.594 20:54:30 -- accel/accel.sh@21 -- # case "$var" in 00:07:14.594 20:54:30 -- accel/accel.sh@19 -- # IFS=: 00:07:14.594 20:54:30 -- accel/accel.sh@19 -- # read -r var val 00:07:14.594 20:54:30 -- accel/accel.sh@20 -- # val= 00:07:14.594 20:54:30 -- accel/accel.sh@21 -- # case "$var" in 00:07:14.594 20:54:30 -- accel/accel.sh@19 -- # IFS=: 00:07:14.594 20:54:30 -- accel/accel.sh@19 -- # read -r var val 00:07:14.594 20:54:30 -- accel/accel.sh@20 -- # val=0x1 00:07:14.594 20:54:30 -- accel/accel.sh@21 -- # case "$var" in 00:07:14.594 20:54:30 -- accel/accel.sh@19 -- # IFS=: 00:07:14.594 20:54:30 -- accel/accel.sh@19 -- # read -r var val 00:07:14.594 20:54:30 -- accel/accel.sh@20 -- # val= 00:07:14.594 20:54:30 -- accel/accel.sh@21 -- # case "$var" in 00:07:14.594 20:54:30 -- accel/accel.sh@19 -- # IFS=: 00:07:14.594 20:54:30 -- accel/accel.sh@19 -- # read -r var val 00:07:14.594 20:54:30 -- accel/accel.sh@20 -- # val= 00:07:14.594 20:54:30 -- accel/accel.sh@21 -- # case "$var" in 00:07:14.594 20:54:30 -- accel/accel.sh@19 -- # IFS=: 00:07:14.594 20:54:30 -- accel/accel.sh@19 -- # read -r var val 00:07:14.594 20:54:30 -- accel/accel.sh@20 -- # val=dif_verify 00:07:14.594 20:54:30 -- accel/accel.sh@21 -- # case "$var" in 00:07:14.594 20:54:30 -- accel/accel.sh@23 -- # accel_opc=dif_verify 00:07:14.594 20:54:30 -- accel/accel.sh@19 -- # IFS=: 00:07:14.594 20:54:30 -- accel/accel.sh@19 -- # read -r var val 00:07:14.594 20:54:30 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:14.594 20:54:30 -- accel/accel.sh@21 -- # case "$var" in 00:07:14.594 20:54:30 -- accel/accel.sh@19 -- # IFS=: 00:07:14.594 20:54:30 -- accel/accel.sh@19 -- # read -r var val 00:07:14.594 20:54:30 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:14.594 20:54:30 -- accel/accel.sh@21 -- # case "$var" in 00:07:14.594 20:54:30 -- 
accel/accel.sh@19 -- # IFS=: 00:07:14.594 20:54:30 -- accel/accel.sh@19 -- # read -r var val 00:07:14.594 20:54:30 -- accel/accel.sh@20 -- # val='512 bytes' 00:07:14.594 20:54:30 -- accel/accel.sh@21 -- # case "$var" in 00:07:14.594 20:54:30 -- accel/accel.sh@19 -- # IFS=: 00:07:14.594 20:54:30 -- accel/accel.sh@19 -- # read -r var val 00:07:14.594 20:54:30 -- accel/accel.sh@20 -- # val='8 bytes' 00:07:14.594 20:54:30 -- accel/accel.sh@21 -- # case "$var" in 00:07:14.594 20:54:30 -- accel/accel.sh@19 -- # IFS=: 00:07:14.594 20:54:30 -- accel/accel.sh@19 -- # read -r var val 00:07:14.594 20:54:30 -- accel/accel.sh@20 -- # val= 00:07:14.594 20:54:30 -- accel/accel.sh@21 -- # case "$var" in 00:07:14.594 20:54:30 -- accel/accel.sh@19 -- # IFS=: 00:07:14.594 20:54:30 -- accel/accel.sh@19 -- # read -r var val 00:07:14.594 20:54:30 -- accel/accel.sh@20 -- # val=software 00:07:14.594 20:54:30 -- accel/accel.sh@21 -- # case "$var" in 00:07:14.594 20:54:30 -- accel/accel.sh@22 -- # accel_module=software 00:07:14.594 20:54:30 -- accel/accel.sh@19 -- # IFS=: 00:07:14.594 20:54:30 -- accel/accel.sh@19 -- # read -r var val 00:07:14.594 20:54:30 -- accel/accel.sh@20 -- # val=32 00:07:14.594 20:54:30 -- accel/accel.sh@21 -- # case "$var" in 00:07:14.594 20:54:30 -- accel/accel.sh@19 -- # IFS=: 00:07:14.594 20:54:30 -- accel/accel.sh@19 -- # read -r var val 00:07:14.594 20:54:30 -- accel/accel.sh@20 -- # val=32 00:07:14.594 20:54:30 -- accel/accel.sh@21 -- # case "$var" in 00:07:14.594 20:54:30 -- accel/accel.sh@19 -- # IFS=: 00:07:14.594 20:54:30 -- accel/accel.sh@19 -- # read -r var val 00:07:14.594 20:54:30 -- accel/accel.sh@20 -- # val=1 00:07:14.594 20:54:30 -- accel/accel.sh@21 -- # case "$var" in 00:07:14.594 20:54:30 -- accel/accel.sh@19 -- # IFS=: 00:07:14.594 20:54:30 -- accel/accel.sh@19 -- # read -r var val 00:07:14.594 20:54:30 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:14.594 20:54:30 -- accel/accel.sh@21 -- # case "$var" in 00:07:14.594 20:54:30 -- accel/accel.sh@19 -- # IFS=: 00:07:14.594 20:54:30 -- accel/accel.sh@19 -- # read -r var val 00:07:14.594 20:54:30 -- accel/accel.sh@20 -- # val=No 00:07:14.594 20:54:30 -- accel/accel.sh@21 -- # case "$var" in 00:07:14.594 20:54:30 -- accel/accel.sh@19 -- # IFS=: 00:07:14.594 20:54:30 -- accel/accel.sh@19 -- # read -r var val 00:07:14.594 20:54:30 -- accel/accel.sh@20 -- # val= 00:07:14.594 20:54:30 -- accel/accel.sh@21 -- # case "$var" in 00:07:14.594 20:54:30 -- accel/accel.sh@19 -- # IFS=: 00:07:14.594 20:54:30 -- accel/accel.sh@19 -- # read -r var val 00:07:14.594 20:54:30 -- accel/accel.sh@20 -- # val= 00:07:14.594 20:54:30 -- accel/accel.sh@21 -- # case "$var" in 00:07:14.594 20:54:30 -- accel/accel.sh@19 -- # IFS=: 00:07:14.594 20:54:30 -- accel/accel.sh@19 -- # read -r var val 00:07:15.972 20:54:31 -- accel/accel.sh@20 -- # val= 00:07:15.972 20:54:31 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.972 20:54:31 -- accel/accel.sh@19 -- # IFS=: 00:07:15.972 20:54:31 -- accel/accel.sh@19 -- # read -r var val 00:07:15.972 20:54:31 -- accel/accel.sh@20 -- # val= 00:07:15.972 20:54:31 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.972 20:54:31 -- accel/accel.sh@19 -- # IFS=: 00:07:15.972 20:54:31 -- accel/accel.sh@19 -- # read -r var val 00:07:15.972 20:54:31 -- accel/accel.sh@20 -- # val= 00:07:15.972 20:54:31 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.972 20:54:31 -- accel/accel.sh@19 -- # IFS=: 00:07:15.972 20:54:31 -- accel/accel.sh@19 -- # read -r var val 00:07:15.972 20:54:31 -- accel/accel.sh@20 -- # val= 00:07:15.972 
20:54:31 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.972 20:54:31 -- accel/accel.sh@19 -- # IFS=: 00:07:15.972 20:54:31 -- accel/accel.sh@19 -- # read -r var val 00:07:15.972 20:54:31 -- accel/accel.sh@20 -- # val= 00:07:15.972 20:54:31 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.972 20:54:31 -- accel/accel.sh@19 -- # IFS=: 00:07:15.972 20:54:31 -- accel/accel.sh@19 -- # read -r var val 00:07:15.972 20:54:31 -- accel/accel.sh@20 -- # val= 00:07:15.972 20:54:31 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.972 20:54:31 -- accel/accel.sh@19 -- # IFS=: 00:07:15.973 20:54:31 -- accel/accel.sh@19 -- # read -r var val 00:07:15.973 20:54:31 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:15.973 20:54:31 -- accel/accel.sh@27 -- # [[ -n dif_verify ]] 00:07:15.973 20:54:31 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:15.973 00:07:15.973 real 0m1.294s 00:07:15.973 user 0m1.170s 00:07:15.973 sys 0m0.128s 00:07:15.973 20:54:31 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:15.973 20:54:31 -- common/autotest_common.sh@10 -- # set +x 00:07:15.973 ************************************ 00:07:15.973 END TEST accel_dif_verify 00:07:15.973 ************************************ 00:07:15.973 20:54:31 -- accel/accel.sh@112 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:07:15.973 20:54:31 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:07:15.973 20:54:31 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:15.973 20:54:31 -- common/autotest_common.sh@10 -- # set +x 00:07:15.973 ************************************ 00:07:15.973 START TEST accel_dif_generate 00:07:15.973 ************************************ 00:07:15.973 20:54:31 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w dif_generate 00:07:15.973 20:54:31 -- accel/accel.sh@16 -- # local accel_opc 00:07:15.973 20:54:31 -- accel/accel.sh@17 -- # local accel_module 00:07:15.973 20:54:31 -- accel/accel.sh@19 -- # IFS=: 00:07:15.973 20:54:31 -- accel/accel.sh@19 -- # read -r var val 00:07:15.973 20:54:31 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:07:15.973 20:54:31 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:07:15.973 20:54:31 -- accel/accel.sh@12 -- # build_accel_config 00:07:15.973 20:54:31 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:15.973 20:54:31 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:15.973 20:54:31 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:15.973 20:54:31 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:15.973 20:54:31 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:15.973 20:54:31 -- accel/accel.sh@40 -- # local IFS=, 00:07:15.973 20:54:31 -- accel/accel.sh@41 -- # jq -r . 00:07:15.973 [2024-04-25 20:54:31.426623] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:07:15.973 [2024-04-25 20:54:31.426709] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid188644 ] 00:07:15.973 EAL: No free 2048 kB hugepages reported on node 1 00:07:15.973 [2024-04-25 20:54:31.464798] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 
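Every test in this section finishes with the same three bracket checks before its timing summary; together they are the pass condition. Restated as plain bash, using the variable names from the parse-loop sketch earlier:

  [[ -n $accel_module ]]            # a module was reported at all
  [[ -n $accel_opc ]]               # the expected opcode was parsed back
  [[ $accel_module == software ]]   # this rig exercises the software path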
00:07:15.973 [2024-04-25 20:54:31.495954] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:15.973 [2024-04-25 20:54:31.532134] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:15.973 20:54:31 -- accel/accel.sh@20 -- # val= 00:07:15.973 20:54:31 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.973 20:54:31 -- accel/accel.sh@19 -- # IFS=: 00:07:15.973 20:54:31 -- accel/accel.sh@19 -- # read -r var val 00:07:15.973 20:54:31 -- accel/accel.sh@20 -- # val= 00:07:15.973 20:54:31 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.973 20:54:31 -- accel/accel.sh@19 -- # IFS=: 00:07:15.973 20:54:31 -- accel/accel.sh@19 -- # read -r var val 00:07:15.973 20:54:31 -- accel/accel.sh@20 -- # val=0x1 00:07:15.973 20:54:31 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.973 20:54:31 -- accel/accel.sh@19 -- # IFS=: 00:07:15.973 20:54:31 -- accel/accel.sh@19 -- # read -r var val 00:07:15.973 20:54:31 -- accel/accel.sh@20 -- # val= 00:07:15.973 20:54:31 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.973 20:54:31 -- accel/accel.sh@19 -- # IFS=: 00:07:15.973 20:54:31 -- accel/accel.sh@19 -- # read -r var val 00:07:15.973 20:54:31 -- accel/accel.sh@20 -- # val= 00:07:15.973 20:54:31 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.973 20:54:31 -- accel/accel.sh@19 -- # IFS=: 00:07:15.973 20:54:31 -- accel/accel.sh@19 -- # read -r var val 00:07:15.973 20:54:31 -- accel/accel.sh@20 -- # val=dif_generate 00:07:15.973 20:54:31 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.973 20:54:31 -- accel/accel.sh@23 -- # accel_opc=dif_generate 00:07:15.973 20:54:31 -- accel/accel.sh@19 -- # IFS=: 00:07:15.973 20:54:31 -- accel/accel.sh@19 -- # read -r var val 00:07:15.973 20:54:31 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:15.973 20:54:31 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.973 20:54:31 -- accel/accel.sh@19 -- # IFS=: 00:07:15.973 20:54:31 -- accel/accel.sh@19 -- # read -r var val 00:07:15.973 20:54:31 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:15.973 20:54:31 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.973 20:54:31 -- accel/accel.sh@19 -- # IFS=: 00:07:15.973 20:54:31 -- accel/accel.sh@19 -- # read -r var val 00:07:15.973 20:54:31 -- accel/accel.sh@20 -- # val='512 bytes' 00:07:15.973 20:54:31 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.973 20:54:31 -- accel/accel.sh@19 -- # IFS=: 00:07:15.973 20:54:31 -- accel/accel.sh@19 -- # read -r var val 00:07:15.973 20:54:31 -- accel/accel.sh@20 -- # val='8 bytes' 00:07:15.973 20:54:31 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.973 20:54:31 -- accel/accel.sh@19 -- # IFS=: 00:07:15.973 20:54:31 -- accel/accel.sh@19 -- # read -r var val 00:07:15.973 20:54:31 -- accel/accel.sh@20 -- # val= 00:07:15.973 20:54:31 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.973 20:54:31 -- accel/accel.sh@19 -- # IFS=: 00:07:15.973 20:54:31 -- accel/accel.sh@19 -- # read -r var val 00:07:15.973 20:54:31 -- accel/accel.sh@20 -- # val=software 00:07:15.973 20:54:31 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.973 20:54:31 -- accel/accel.sh@22 -- # accel_module=software 00:07:15.973 20:54:31 -- accel/accel.sh@19 -- # IFS=: 00:07:15.973 20:54:31 -- accel/accel.sh@19 -- # read -r var val 00:07:15.973 20:54:31 -- accel/accel.sh@20 -- # val=32 00:07:15.973 20:54:31 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.973 20:54:31 -- accel/accel.sh@19 -- # IFS=: 00:07:15.973 20:54:31 -- accel/accel.sh@19 -- # read -r var val 00:07:15.973 20:54:31 -- accel/accel.sh@20 -- # val=32 00:07:15.973 20:54:31 -- 
accel/accel.sh@21 -- # case "$var" in 00:07:15.973 20:54:31 -- accel/accel.sh@19 -- # IFS=: 00:07:15.973 20:54:31 -- accel/accel.sh@19 -- # read -r var val 00:07:15.973 20:54:31 -- accel/accel.sh@20 -- # val=1 00:07:15.973 20:54:31 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.973 20:54:31 -- accel/accel.sh@19 -- # IFS=: 00:07:15.973 20:54:31 -- accel/accel.sh@19 -- # read -r var val 00:07:15.973 20:54:31 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:15.973 20:54:31 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.973 20:54:31 -- accel/accel.sh@19 -- # IFS=: 00:07:15.973 20:54:31 -- accel/accel.sh@19 -- # read -r var val 00:07:15.973 20:54:31 -- accel/accel.sh@20 -- # val=No 00:07:15.973 20:54:31 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.973 20:54:31 -- accel/accel.sh@19 -- # IFS=: 00:07:15.973 20:54:31 -- accel/accel.sh@19 -- # read -r var val 00:07:15.973 20:54:31 -- accel/accel.sh@20 -- # val= 00:07:15.973 20:54:31 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.973 20:54:31 -- accel/accel.sh@19 -- # IFS=: 00:07:15.973 20:54:31 -- accel/accel.sh@19 -- # read -r var val 00:07:15.973 20:54:31 -- accel/accel.sh@20 -- # val= 00:07:15.973 20:54:31 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.973 20:54:31 -- accel/accel.sh@19 -- # IFS=: 00:07:15.973 20:54:31 -- accel/accel.sh@19 -- # read -r var val 00:07:17.351 20:54:32 -- accel/accel.sh@20 -- # val= 00:07:17.351 20:54:32 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.351 20:54:32 -- accel/accel.sh@19 -- # IFS=: 00:07:17.351 20:54:32 -- accel/accel.sh@19 -- # read -r var val 00:07:17.351 20:54:32 -- accel/accel.sh@20 -- # val= 00:07:17.351 20:54:32 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.351 20:54:32 -- accel/accel.sh@19 -- # IFS=: 00:07:17.351 20:54:32 -- accel/accel.sh@19 -- # read -r var val 00:07:17.351 20:54:32 -- accel/accel.sh@20 -- # val= 00:07:17.351 20:54:32 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.351 20:54:32 -- accel/accel.sh@19 -- # IFS=: 00:07:17.351 20:54:32 -- accel/accel.sh@19 -- # read -r var val 00:07:17.351 20:54:32 -- accel/accel.sh@20 -- # val= 00:07:17.351 20:54:32 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.351 20:54:32 -- accel/accel.sh@19 -- # IFS=: 00:07:17.351 20:54:32 -- accel/accel.sh@19 -- # read -r var val 00:07:17.351 20:54:32 -- accel/accel.sh@20 -- # val= 00:07:17.351 20:54:32 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.351 20:54:32 -- accel/accel.sh@19 -- # IFS=: 00:07:17.351 20:54:32 -- accel/accel.sh@19 -- # read -r var val 00:07:17.351 20:54:32 -- accel/accel.sh@20 -- # val= 00:07:17.351 20:54:32 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.351 20:54:32 -- accel/accel.sh@19 -- # IFS=: 00:07:17.351 20:54:32 -- accel/accel.sh@19 -- # read -r var val 00:07:17.351 20:54:32 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:17.351 20:54:32 -- accel/accel.sh@27 -- # [[ -n dif_generate ]] 00:07:17.351 20:54:32 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:17.351 00:07:17.351 real 0m1.287s 00:07:17.351 user 0m1.165s 00:07:17.351 sys 0m0.127s 00:07:17.351 20:54:32 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:17.351 20:54:32 -- common/autotest_common.sh@10 -- # set +x 00:07:17.351 ************************************ 00:07:17.351 END TEST accel_dif_generate 00:07:17.351 ************************************ 00:07:17.351 20:54:32 -- accel/accel.sh@113 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:07:17.351 20:54:32 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:07:17.351 
20:54:32 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:17.351 20:54:32 -- common/autotest_common.sh@10 -- # set +x 00:07:17.351 ************************************ 00:07:17.351 START TEST accel_dif_generate_copy 00:07:17.351 ************************************ 00:07:17.351 20:54:32 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w dif_generate_copy 00:07:17.351 20:54:32 -- accel/accel.sh@16 -- # local accel_opc 00:07:17.351 20:54:32 -- accel/accel.sh@17 -- # local accel_module 00:07:17.351 20:54:32 -- accel/accel.sh@19 -- # IFS=: 00:07:17.351 20:54:32 -- accel/accel.sh@19 -- # read -r var val 00:07:17.351 20:54:32 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:07:17.351 20:54:32 -- accel/accel.sh@12 -- # build_accel_config 00:07:17.351 20:54:32 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:07:17.351 20:54:32 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:17.351 20:54:32 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:17.351 20:54:32 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:17.351 20:54:32 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:17.351 20:54:32 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:17.351 20:54:32 -- accel/accel.sh@40 -- # local IFS=, 00:07:17.351 20:54:32 -- accel/accel.sh@41 -- # jq -r . 00:07:17.351 [2024-04-25 20:54:32.909422] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:07:17.351 [2024-04-25 20:54:32.909502] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid188908 ] 00:07:17.351 EAL: No free 2048 kB hugepages reported on node 1 00:07:17.351 [2024-04-25 20:54:32.948569] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 
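The "real 0m1.2xs" summaries seen after each test time the whole accel_test invocation: roughly the one second requested with -t 1 plus SPDK app start-up and teardown. The equivalent hand measurement is just bash's time builtin:

  # expect ~1.3s wall clock: 1s of workload plus app init/teardown
  time ./build/examples/accel_perf -t 1 -w dif_generate_copy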
00:07:17.351 [2024-04-25 20:54:32.979893] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:17.609 [2024-04-25 20:54:33.017677] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:17.609 20:54:33 -- accel/accel.sh@20 -- # val= 00:07:17.609 20:54:33 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.609 20:54:33 -- accel/accel.sh@19 -- # IFS=: 00:07:17.609 20:54:33 -- accel/accel.sh@19 -- # read -r var val 00:07:17.609 20:54:33 -- accel/accel.sh@20 -- # val= 00:07:17.609 20:54:33 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.609 20:54:33 -- accel/accel.sh@19 -- # IFS=: 00:07:17.609 20:54:33 -- accel/accel.sh@19 -- # read -r var val 00:07:17.609 20:54:33 -- accel/accel.sh@20 -- # val=0x1 00:07:17.609 20:54:33 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.609 20:54:33 -- accel/accel.sh@19 -- # IFS=: 00:07:17.609 20:54:33 -- accel/accel.sh@19 -- # read -r var val 00:07:17.609 20:54:33 -- accel/accel.sh@20 -- # val= 00:07:17.609 20:54:33 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.609 20:54:33 -- accel/accel.sh@19 -- # IFS=: 00:07:17.609 20:54:33 -- accel/accel.sh@19 -- # read -r var val 00:07:17.609 20:54:33 -- accel/accel.sh@20 -- # val= 00:07:17.609 20:54:33 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.609 20:54:33 -- accel/accel.sh@19 -- # IFS=: 00:07:17.609 20:54:33 -- accel/accel.sh@19 -- # read -r var val 00:07:17.609 20:54:33 -- accel/accel.sh@20 -- # val=dif_generate_copy 00:07:17.609 20:54:33 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.609 20:54:33 -- accel/accel.sh@23 -- # accel_opc=dif_generate_copy 00:07:17.609 20:54:33 -- accel/accel.sh@19 -- # IFS=: 00:07:17.609 20:54:33 -- accel/accel.sh@19 -- # read -r var val 00:07:17.609 20:54:33 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:17.609 20:54:33 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.609 20:54:33 -- accel/accel.sh@19 -- # IFS=: 00:07:17.609 20:54:33 -- accel/accel.sh@19 -- # read -r var val 00:07:17.609 20:54:33 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:17.609 20:54:33 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.609 20:54:33 -- accel/accel.sh@19 -- # IFS=: 00:07:17.609 20:54:33 -- accel/accel.sh@19 -- # read -r var val 00:07:17.609 20:54:33 -- accel/accel.sh@20 -- # val= 00:07:17.609 20:54:33 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.609 20:54:33 -- accel/accel.sh@19 -- # IFS=: 00:07:17.609 20:54:33 -- accel/accel.sh@19 -- # read -r var val 00:07:17.609 20:54:33 -- accel/accel.sh@20 -- # val=software 00:07:17.609 20:54:33 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.609 20:54:33 -- accel/accel.sh@22 -- # accel_module=software 00:07:17.609 20:54:33 -- accel/accel.sh@19 -- # IFS=: 00:07:17.609 20:54:33 -- accel/accel.sh@19 -- # read -r var val 00:07:17.609 20:54:33 -- accel/accel.sh@20 -- # val=32 00:07:17.609 20:54:33 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.609 20:54:33 -- accel/accel.sh@19 -- # IFS=: 00:07:17.609 20:54:33 -- accel/accel.sh@19 -- # read -r var val 00:07:17.609 20:54:33 -- accel/accel.sh@20 -- # val=32 00:07:17.609 20:54:33 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.609 20:54:33 -- accel/accel.sh@19 -- # IFS=: 00:07:17.609 20:54:33 -- accel/accel.sh@19 -- # read -r var val 00:07:17.609 20:54:33 -- accel/accel.sh@20 -- # val=1 00:07:17.609 20:54:33 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.609 20:54:33 -- accel/accel.sh@19 -- # IFS=: 00:07:17.609 20:54:33 -- accel/accel.sh@19 -- # read -r var val 00:07:17.609 20:54:33 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:17.609 20:54:33 -- 
accel/accel.sh@21 -- # case "$var" in 00:07:17.609 20:54:33 -- accel/accel.sh@19 -- # IFS=: 00:07:17.609 20:54:33 -- accel/accel.sh@19 -- # read -r var val 00:07:17.609 20:54:33 -- accel/accel.sh@20 -- # val=No 00:07:17.609 20:54:33 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.609 20:54:33 -- accel/accel.sh@19 -- # IFS=: 00:07:17.609 20:54:33 -- accel/accel.sh@19 -- # read -r var val 00:07:17.609 20:54:33 -- accel/accel.sh@20 -- # val= 00:07:17.609 20:54:33 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.609 20:54:33 -- accel/accel.sh@19 -- # IFS=: 00:07:17.609 20:54:33 -- accel/accel.sh@19 -- # read -r var val 00:07:17.609 20:54:33 -- accel/accel.sh@20 -- # val= 00:07:17.609 20:54:33 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.609 20:54:33 -- accel/accel.sh@19 -- # IFS=: 00:07:17.609 20:54:33 -- accel/accel.sh@19 -- # read -r var val 00:07:18.545 20:54:34 -- accel/accel.sh@20 -- # val= 00:07:18.545 20:54:34 -- accel/accel.sh@21 -- # case "$var" in 00:07:18.545 20:54:34 -- accel/accel.sh@19 -- # IFS=: 00:07:18.545 20:54:34 -- accel/accel.sh@19 -- # read -r var val 00:07:18.545 20:54:34 -- accel/accel.sh@20 -- # val= 00:07:18.545 20:54:34 -- accel/accel.sh@21 -- # case "$var" in 00:07:18.545 20:54:34 -- accel/accel.sh@19 -- # IFS=: 00:07:18.545 20:54:34 -- accel/accel.sh@19 -- # read -r var val 00:07:18.545 20:54:34 -- accel/accel.sh@20 -- # val= 00:07:18.545 20:54:34 -- accel/accel.sh@21 -- # case "$var" in 00:07:18.545 20:54:34 -- accel/accel.sh@19 -- # IFS=: 00:07:18.545 20:54:34 -- accel/accel.sh@19 -- # read -r var val 00:07:18.545 20:54:34 -- accel/accel.sh@20 -- # val= 00:07:18.545 20:54:34 -- accel/accel.sh@21 -- # case "$var" in 00:07:18.545 20:54:34 -- accel/accel.sh@19 -- # IFS=: 00:07:18.545 20:54:34 -- accel/accel.sh@19 -- # read -r var val 00:07:18.545 20:54:34 -- accel/accel.sh@20 -- # val= 00:07:18.545 20:54:34 -- accel/accel.sh@21 -- # case "$var" in 00:07:18.545 20:54:34 -- accel/accel.sh@19 -- # IFS=: 00:07:18.545 20:54:34 -- accel/accel.sh@19 -- # read -r var val 00:07:18.545 20:54:34 -- accel/accel.sh@20 -- # val= 00:07:18.545 20:54:34 -- accel/accel.sh@21 -- # case "$var" in 00:07:18.545 20:54:34 -- accel/accel.sh@19 -- # IFS=: 00:07:18.545 20:54:34 -- accel/accel.sh@19 -- # read -r var val 00:07:18.545 20:54:34 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:18.545 20:54:34 -- accel/accel.sh@27 -- # [[ -n dif_generate_copy ]] 00:07:18.545 20:54:34 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:18.545 00:07:18.545 real 0m1.291s 00:07:18.545 user 0m1.162s 00:07:18.545 sys 0m0.134s 00:07:18.545 20:54:34 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:18.545 20:54:34 -- common/autotest_common.sh@10 -- # set +x 00:07:18.545 ************************************ 00:07:18.545 END TEST accel_dif_generate_copy 00:07:18.545 ************************************ 00:07:18.803 20:54:34 -- accel/accel.sh@115 -- # [[ y == y ]] 00:07:18.803 20:54:34 -- accel/accel.sh@116 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:18.803 20:54:34 -- common/autotest_common.sh@1087 -- # '[' 8 -le 1 ']' 00:07:18.803 20:54:34 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:18.803 20:54:34 -- common/autotest_common.sh@10 -- # set +x 00:07:18.803 ************************************ 00:07:18.803 START TEST accel_comp 00:07:18.803 ************************************ 00:07:18.803 20:54:34 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w compress -l 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib
00:07:18.803 20:54:34 -- accel/accel.sh@16 -- # local accel_opc
00:07:18.803 20:54:34 -- accel/accel.sh@17 -- # local accel_module
00:07:18.803 20:54:34 -- accel/accel.sh@19 -- # IFS=:
00:07:18.803 20:54:34 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib
00:07:18.803 20:54:34 -- accel/accel.sh@19 -- # read -r var val
00:07:18.803 20:54:34 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib
00:07:18.803 20:54:34 -- accel/accel.sh@12 -- # build_accel_config
00:07:18.803 20:54:34 -- accel/accel.sh@31 -- # accel_json_cfg=()
00:07:18.803 20:54:34 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:07:18.803 20:54:34 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:07:18.803 20:54:34 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:07:18.804 20:54:34 -- accel/accel.sh@36 -- # [[ -n '' ]]
00:07:18.804 20:54:34 -- accel/accel.sh@40 -- # local IFS=,
00:07:18.804 20:54:34 -- accel/accel.sh@41 -- # jq -r .
00:07:18.804 [2024-04-25 20:54:34.397050] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization...
00:07:18.804 [2024-04-25 20:54:34.397126] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid189179 ]
00:07:18.804 EAL: No free 2048 kB hugepages reported on node 1
00:07:18.804 [2024-04-25 20:54:34.436821] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation.
00:07:19.063 [2024-04-25 20:54:34.471070] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:19.063 [2024-04-25 20:54:34.507450] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:19.063 20:54:34 -- accel/accel.sh@20 -- # val= 00:07:19.063 20:54:34 -- accel/accel.sh@21 -- # case "$var" in 00:07:19.063 20:54:34 -- accel/accel.sh@19 -- # IFS=: 00:07:19.063 20:54:34 -- accel/accel.sh@19 -- # read -r var val 00:07:19.063 20:54:34 -- accel/accel.sh@20 -- # val= 00:07:19.064 20:54:34 -- accel/accel.sh@21 -- # case "$var" in 00:07:19.064 20:54:34 -- accel/accel.sh@19 -- # IFS=: 00:07:19.064 20:54:34 -- accel/accel.sh@19 -- # read -r var val 00:07:19.064 20:54:34 -- accel/accel.sh@20 -- # val= 00:07:19.064 20:54:34 -- accel/accel.sh@21 -- # case "$var" in 00:07:19.064 20:54:34 -- accel/accel.sh@19 -- # IFS=: 00:07:19.064 20:54:34 -- accel/accel.sh@19 -- # read -r var val 00:07:19.064 20:54:34 -- accel/accel.sh@20 -- # val=0x1 00:07:19.064 20:54:34 -- accel/accel.sh@21 -- # case "$var" in 00:07:19.064 20:54:34 -- accel/accel.sh@19 -- # IFS=: 00:07:19.064 20:54:34 -- accel/accel.sh@19 -- # read -r var val 00:07:19.064 20:54:34 -- accel/accel.sh@20 -- # val= 00:07:19.064 20:54:34 -- accel/accel.sh@21 -- # case "$var" in 00:07:19.064 20:54:34 -- accel/accel.sh@19 -- # IFS=: 00:07:19.064 20:54:34 -- accel/accel.sh@19 -- # read -r var val 00:07:19.064 20:54:34 -- accel/accel.sh@20 -- # val= 00:07:19.064 20:54:34 -- accel/accel.sh@21 -- # case "$var" in 00:07:19.064 20:54:34 -- accel/accel.sh@19 -- # IFS=: 00:07:19.064 20:54:34 -- accel/accel.sh@19 -- # read -r var val 00:07:19.064 20:54:34 -- accel/accel.sh@20 -- # val=compress 00:07:19.064 20:54:34 -- accel/accel.sh@21 -- # case "$var" in 00:07:19.064 20:54:34 -- accel/accel.sh@23 -- # accel_opc=compress 00:07:19.064 20:54:34 -- accel/accel.sh@19 -- # IFS=: 00:07:19.064 20:54:34 -- accel/accel.sh@19 -- # read -r var val 00:07:19.064 20:54:34 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:19.064 20:54:34 -- accel/accel.sh@21 -- # case "$var" in 00:07:19.064 20:54:34 -- accel/accel.sh@19 -- # IFS=: 00:07:19.064 20:54:34 -- accel/accel.sh@19 -- # read -r var val 00:07:19.064 20:54:34 -- accel/accel.sh@20 -- # val= 00:07:19.064 20:54:34 -- accel/accel.sh@21 -- # case "$var" in 00:07:19.064 20:54:34 -- accel/accel.sh@19 -- # IFS=: 00:07:19.064 20:54:34 -- accel/accel.sh@19 -- # read -r var val 00:07:19.064 20:54:34 -- accel/accel.sh@20 -- # val=software 00:07:19.064 20:54:34 -- accel/accel.sh@21 -- # case "$var" in 00:07:19.064 20:54:34 -- accel/accel.sh@22 -- # accel_module=software 00:07:19.064 20:54:34 -- accel/accel.sh@19 -- # IFS=: 00:07:19.064 20:54:34 -- accel/accel.sh@19 -- # read -r var val 00:07:19.064 20:54:34 -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:19.064 20:54:34 -- accel/accel.sh@21 -- # case "$var" in 00:07:19.064 20:54:34 -- accel/accel.sh@19 -- # IFS=: 00:07:19.064 20:54:34 -- accel/accel.sh@19 -- # read -r var val 00:07:19.064 20:54:34 -- accel/accel.sh@20 -- # val=32 00:07:19.064 20:54:34 -- accel/accel.sh@21 -- # case "$var" in 00:07:19.064 20:54:34 -- accel/accel.sh@19 -- # IFS=: 00:07:19.064 20:54:34 -- accel/accel.sh@19 -- # read -r var val 00:07:19.064 20:54:34 -- accel/accel.sh@20 -- # val=32 00:07:19.064 20:54:34 -- accel/accel.sh@21 -- # case "$var" in 00:07:19.064 20:54:34 -- accel/accel.sh@19 -- # IFS=: 00:07:19.064 20:54:34 -- accel/accel.sh@19 -- # read -r var val 00:07:19.064 20:54:34 -- accel/accel.sh@20 -- # val=1 
00:07:19.064 20:54:34 -- accel/accel.sh@21 -- # case "$var" in 00:07:19.064 20:54:34 -- accel/accel.sh@19 -- # IFS=: 00:07:19.064 20:54:34 -- accel/accel.sh@19 -- # read -r var val 00:07:19.064 20:54:34 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:19.064 20:54:34 -- accel/accel.sh@21 -- # case "$var" in 00:07:19.064 20:54:34 -- accel/accel.sh@19 -- # IFS=: 00:07:19.064 20:54:34 -- accel/accel.sh@19 -- # read -r var val 00:07:19.064 20:54:34 -- accel/accel.sh@20 -- # val=No 00:07:19.064 20:54:34 -- accel/accel.sh@21 -- # case "$var" in 00:07:19.064 20:54:34 -- accel/accel.sh@19 -- # IFS=: 00:07:19.064 20:54:34 -- accel/accel.sh@19 -- # read -r var val 00:07:19.064 20:54:34 -- accel/accel.sh@20 -- # val= 00:07:19.064 20:54:34 -- accel/accel.sh@21 -- # case "$var" in 00:07:19.064 20:54:34 -- accel/accel.sh@19 -- # IFS=: 00:07:19.064 20:54:34 -- accel/accel.sh@19 -- # read -r var val 00:07:19.064 20:54:34 -- accel/accel.sh@20 -- # val= 00:07:19.064 20:54:34 -- accel/accel.sh@21 -- # case "$var" in 00:07:19.064 20:54:34 -- accel/accel.sh@19 -- # IFS=: 00:07:19.064 20:54:34 -- accel/accel.sh@19 -- # read -r var val 00:07:20.444 20:54:35 -- accel/accel.sh@20 -- # val= 00:07:20.444 20:54:35 -- accel/accel.sh@21 -- # case "$var" in 00:07:20.444 20:54:35 -- accel/accel.sh@19 -- # IFS=: 00:07:20.444 20:54:35 -- accel/accel.sh@19 -- # read -r var val 00:07:20.444 20:54:35 -- accel/accel.sh@20 -- # val= 00:07:20.444 20:54:35 -- accel/accel.sh@21 -- # case "$var" in 00:07:20.444 20:54:35 -- accel/accel.sh@19 -- # IFS=: 00:07:20.444 20:54:35 -- accel/accel.sh@19 -- # read -r var val 00:07:20.444 20:54:35 -- accel/accel.sh@20 -- # val= 00:07:20.444 20:54:35 -- accel/accel.sh@21 -- # case "$var" in 00:07:20.444 20:54:35 -- accel/accel.sh@19 -- # IFS=: 00:07:20.444 20:54:35 -- accel/accel.sh@19 -- # read -r var val 00:07:20.444 20:54:35 -- accel/accel.sh@20 -- # val= 00:07:20.444 20:54:35 -- accel/accel.sh@21 -- # case "$var" in 00:07:20.444 20:54:35 -- accel/accel.sh@19 -- # IFS=: 00:07:20.444 20:54:35 -- accel/accel.sh@19 -- # read -r var val 00:07:20.444 20:54:35 -- accel/accel.sh@20 -- # val= 00:07:20.444 20:54:35 -- accel/accel.sh@21 -- # case "$var" in 00:07:20.444 20:54:35 -- accel/accel.sh@19 -- # IFS=: 00:07:20.444 20:54:35 -- accel/accel.sh@19 -- # read -r var val 00:07:20.444 20:54:35 -- accel/accel.sh@20 -- # val= 00:07:20.444 20:54:35 -- accel/accel.sh@21 -- # case "$var" in 00:07:20.444 20:54:35 -- accel/accel.sh@19 -- # IFS=: 00:07:20.444 20:54:35 -- accel/accel.sh@19 -- # read -r var val 00:07:20.444 20:54:35 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:20.444 20:54:35 -- accel/accel.sh@27 -- # [[ -n compress ]] 00:07:20.444 20:54:35 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:20.444 00:07:20.444 real 0m1.294s 00:07:20.444 user 0m1.165s 00:07:20.444 sys 0m0.134s 00:07:20.444 20:54:35 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:20.444 20:54:35 -- common/autotest_common.sh@10 -- # set +x 00:07:20.444 ************************************ 00:07:20.444 END TEST accel_comp 00:07:20.444 ************************************ 00:07:20.444 20:54:35 -- accel/accel.sh@117 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:20.444 20:54:35 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:07:20.444 20:54:35 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:20.444 20:54:35 -- common/autotest_common.sh@10 -- # set +x 00:07:20.444 
************************************
00:07:20.444 START TEST accel_decomp
************************************
00:07:20.444 20:54:35 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y
00:07:20.444 20:54:35 -- accel/accel.sh@16 -- # local accel_opc
00:07:20.444 20:54:35 -- accel/accel.sh@17 -- # local accel_module
00:07:20.444 20:54:35 -- accel/accel.sh@19 -- # IFS=:
00:07:20.444 20:54:35 -- accel/accel.sh@19 -- # read -r var val
00:07:20.444 20:54:35 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y
00:07:20.444 20:54:35 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y
00:07:20.444 20:54:35 -- accel/accel.sh@12 -- # build_accel_config
00:07:20.444 20:54:35 -- accel/accel.sh@31 -- # accel_json_cfg=()
00:07:20.444 20:54:35 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:07:20.444 20:54:35 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:07:20.444 20:54:35 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:07:20.444 20:54:35 -- accel/accel.sh@36 -- # [[ -n '' ]]
00:07:20.444 20:54:35 -- accel/accel.sh@40 -- # local IFS=,
00:07:20.444 20:54:35 -- accel/accel.sh@41 -- # jq -r .
00:07:20.444 [2024-04-25 20:54:35.890274] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization...
00:07:20.444 [2024-04-25 20:54:35.890354] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid189446 ]
00:07:20.444 EAL: No free 2048 kB hugepages reported on node 1
00:07:20.444 [2024-04-25 20:54:35.929565] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation.
00:07:20.444 [2024-04-25 20:54:35.960964] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:20.444 [2024-04-25 20:54:35.996915] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:20.444 20:54:36 -- accel/accel.sh@20 -- # val= 00:07:20.444 20:54:36 -- accel/accel.sh@21 -- # case "$var" in 00:07:20.444 20:54:36 -- accel/accel.sh@19 -- # IFS=: 00:07:20.444 20:54:36 -- accel/accel.sh@19 -- # read -r var val 00:07:20.444 20:54:36 -- accel/accel.sh@20 -- # val= 00:07:20.444 20:54:36 -- accel/accel.sh@21 -- # case "$var" in 00:07:20.444 20:54:36 -- accel/accel.sh@19 -- # IFS=: 00:07:20.444 20:54:36 -- accel/accel.sh@19 -- # read -r var val 00:07:20.444 20:54:36 -- accel/accel.sh@20 -- # val= 00:07:20.444 20:54:36 -- accel/accel.sh@21 -- # case "$var" in 00:07:20.444 20:54:36 -- accel/accel.sh@19 -- # IFS=: 00:07:20.444 20:54:36 -- accel/accel.sh@19 -- # read -r var val 00:07:20.444 20:54:36 -- accel/accel.sh@20 -- # val=0x1 00:07:20.444 20:54:36 -- accel/accel.sh@21 -- # case "$var" in 00:07:20.444 20:54:36 -- accel/accel.sh@19 -- # IFS=: 00:07:20.444 20:54:36 -- accel/accel.sh@19 -- # read -r var val 00:07:20.444 20:54:36 -- accel/accel.sh@20 -- # val= 00:07:20.444 20:54:36 -- accel/accel.sh@21 -- # case "$var" in 00:07:20.444 20:54:36 -- accel/accel.sh@19 -- # IFS=: 00:07:20.444 20:54:36 -- accel/accel.sh@19 -- # read -r var val 00:07:20.444 20:54:36 -- accel/accel.sh@20 -- # val= 00:07:20.444 20:54:36 -- accel/accel.sh@21 -- # case "$var" in 00:07:20.444 20:54:36 -- accel/accel.sh@19 -- # IFS=: 00:07:20.444 20:54:36 -- accel/accel.sh@19 -- # read -r var val 00:07:20.444 20:54:36 -- accel/accel.sh@20 -- # val=decompress 00:07:20.444 20:54:36 -- accel/accel.sh@21 -- # case "$var" in 00:07:20.444 20:54:36 -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:20.444 20:54:36 -- accel/accel.sh@19 -- # IFS=: 00:07:20.444 20:54:36 -- accel/accel.sh@19 -- # read -r var val 00:07:20.444 20:54:36 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:20.444 20:54:36 -- accel/accel.sh@21 -- # case "$var" in 00:07:20.444 20:54:36 -- accel/accel.sh@19 -- # IFS=: 00:07:20.444 20:54:36 -- accel/accel.sh@19 -- # read -r var val 00:07:20.444 20:54:36 -- accel/accel.sh@20 -- # val= 00:07:20.444 20:54:36 -- accel/accel.sh@21 -- # case "$var" in 00:07:20.444 20:54:36 -- accel/accel.sh@19 -- # IFS=: 00:07:20.444 20:54:36 -- accel/accel.sh@19 -- # read -r var val 00:07:20.444 20:54:36 -- accel/accel.sh@20 -- # val=software 00:07:20.444 20:54:36 -- accel/accel.sh@21 -- # case "$var" in 00:07:20.444 20:54:36 -- accel/accel.sh@22 -- # accel_module=software 00:07:20.444 20:54:36 -- accel/accel.sh@19 -- # IFS=: 00:07:20.444 20:54:36 -- accel/accel.sh@19 -- # read -r var val 00:07:20.444 20:54:36 -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:20.444 20:54:36 -- accel/accel.sh@21 -- # case "$var" in 00:07:20.444 20:54:36 -- accel/accel.sh@19 -- # IFS=: 00:07:20.444 20:54:36 -- accel/accel.sh@19 -- # read -r var val 00:07:20.444 20:54:36 -- accel/accel.sh@20 -- # val=32 00:07:20.444 20:54:36 -- accel/accel.sh@21 -- # case "$var" in 00:07:20.444 20:54:36 -- accel/accel.sh@19 -- # IFS=: 00:07:20.444 20:54:36 -- accel/accel.sh@19 -- # read -r var val 00:07:20.444 20:54:36 -- accel/accel.sh@20 -- # val=32 00:07:20.444 20:54:36 -- accel/accel.sh@21 -- # case "$var" in 00:07:20.444 20:54:36 -- accel/accel.sh@19 -- # IFS=: 00:07:20.444 20:54:36 -- accel/accel.sh@19 -- # read -r var val 00:07:20.444 20:54:36 -- accel/accel.sh@20 -- # val=1 
00:07:20.444 20:54:36 -- accel/accel.sh@21 -- # case "$var" in 00:07:20.444 20:54:36 -- accel/accel.sh@19 -- # IFS=: 00:07:20.444 20:54:36 -- accel/accel.sh@19 -- # read -r var val 00:07:20.444 20:54:36 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:20.444 20:54:36 -- accel/accel.sh@21 -- # case "$var" in 00:07:20.444 20:54:36 -- accel/accel.sh@19 -- # IFS=: 00:07:20.444 20:54:36 -- accel/accel.sh@19 -- # read -r var val 00:07:20.444 20:54:36 -- accel/accel.sh@20 -- # val=Yes 00:07:20.444 20:54:36 -- accel/accel.sh@21 -- # case "$var" in 00:07:20.444 20:54:36 -- accel/accel.sh@19 -- # IFS=: 00:07:20.444 20:54:36 -- accel/accel.sh@19 -- # read -r var val 00:07:20.444 20:54:36 -- accel/accel.sh@20 -- # val= 00:07:20.444 20:54:36 -- accel/accel.sh@21 -- # case "$var" in 00:07:20.444 20:54:36 -- accel/accel.sh@19 -- # IFS=: 00:07:20.444 20:54:36 -- accel/accel.sh@19 -- # read -r var val 00:07:20.444 20:54:36 -- accel/accel.sh@20 -- # val= 00:07:20.445 20:54:36 -- accel/accel.sh@21 -- # case "$var" in 00:07:20.445 20:54:36 -- accel/accel.sh@19 -- # IFS=: 00:07:20.445 20:54:36 -- accel/accel.sh@19 -- # read -r var val 00:07:21.820 20:54:37 -- accel/accel.sh@20 -- # val= 00:07:21.820 20:54:37 -- accel/accel.sh@21 -- # case "$var" in 00:07:21.820 20:54:37 -- accel/accel.sh@19 -- # IFS=: 00:07:21.820 20:54:37 -- accel/accel.sh@19 -- # read -r var val 00:07:21.820 20:54:37 -- accel/accel.sh@20 -- # val= 00:07:21.820 20:54:37 -- accel/accel.sh@21 -- # case "$var" in 00:07:21.820 20:54:37 -- accel/accel.sh@19 -- # IFS=: 00:07:21.820 20:54:37 -- accel/accel.sh@19 -- # read -r var val 00:07:21.820 20:54:37 -- accel/accel.sh@20 -- # val= 00:07:21.820 20:54:37 -- accel/accel.sh@21 -- # case "$var" in 00:07:21.820 20:54:37 -- accel/accel.sh@19 -- # IFS=: 00:07:21.820 20:54:37 -- accel/accel.sh@19 -- # read -r var val 00:07:21.820 20:54:37 -- accel/accel.sh@20 -- # val= 00:07:21.820 20:54:37 -- accel/accel.sh@21 -- # case "$var" in 00:07:21.820 20:54:37 -- accel/accel.sh@19 -- # IFS=: 00:07:21.820 20:54:37 -- accel/accel.sh@19 -- # read -r var val 00:07:21.820 20:54:37 -- accel/accel.sh@20 -- # val= 00:07:21.820 20:54:37 -- accel/accel.sh@21 -- # case "$var" in 00:07:21.820 20:54:37 -- accel/accel.sh@19 -- # IFS=: 00:07:21.820 20:54:37 -- accel/accel.sh@19 -- # read -r var val 00:07:21.820 20:54:37 -- accel/accel.sh@20 -- # val= 00:07:21.820 20:54:37 -- accel/accel.sh@21 -- # case "$var" in 00:07:21.820 20:54:37 -- accel/accel.sh@19 -- # IFS=: 00:07:21.820 20:54:37 -- accel/accel.sh@19 -- # read -r var val 00:07:21.820 20:54:37 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:21.820 20:54:37 -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:21.820 20:54:37 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:21.820 00:07:21.820 real 0m1.292s 00:07:21.820 user 0m1.169s 00:07:21.820 sys 0m0.128s 00:07:21.820 20:54:37 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:21.820 20:54:37 -- common/autotest_common.sh@10 -- # set +x 00:07:21.820 ************************************ 00:07:21.820 END TEST accel_decomp 00:07:21.820 ************************************ 00:07:21.820 20:54:37 -- accel/accel.sh@118 -- # run_test accel_decmop_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:21.820 20:54:37 -- common/autotest_common.sh@1087 -- # '[' 11 -le 1 ']' 00:07:21.820 20:54:37 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:21.820 20:54:37 -- common/autotest_common.sh@10 -- # set +x 00:07:21.820 
************************************
00:07:21.820 START TEST accel_decmop_full
************************************
00:07:21.820 20:54:37 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0
00:07:21.820 20:54:37 -- accel/accel.sh@16 -- # local accel_opc
00:07:21.820 20:54:37 -- accel/accel.sh@17 -- # local accel_module
00:07:21.820 20:54:37 -- accel/accel.sh@19 -- # IFS=:
00:07:21.820 20:54:37 -- accel/accel.sh@19 -- # read -r var val
00:07:21.820 20:54:37 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0
00:07:21.820 20:54:37 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0
00:07:21.820 20:54:37 -- accel/accel.sh@12 -- # build_accel_config
00:07:21.820 20:54:37 -- accel/accel.sh@31 -- # accel_json_cfg=()
00:07:21.820 20:54:37 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:07:21.820 20:54:37 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:07:21.820 20:54:37 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:07:21.820 20:54:37 -- accel/accel.sh@36 -- # [[ -n '' ]]
00:07:21.820 20:54:37 -- accel/accel.sh@40 -- # local IFS=,
00:07:21.820 20:54:37 -- accel/accel.sh@41 -- # jq -r .
00:07:21.820 [2024-04-25 20:54:37.391544] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization...
00:07:21.820 [2024-04-25 20:54:37.391639] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid189708 ]
00:07:21.820 EAL: No free 2048 kB hugepages reported on node 1
00:07:21.820 [2024-04-25 20:54:37.430060] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation.
00:07:21.820 [2024-04-25 20:54:37.460879] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:22.079 [2024-04-25 20:54:37.500064] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:22.079 20:54:37 -- accel/accel.sh@20 -- # val= 00:07:22.079 20:54:37 -- accel/accel.sh@21 -- # case "$var" in 00:07:22.079 20:54:37 -- accel/accel.sh@19 -- # IFS=: 00:07:22.079 20:54:37 -- accel/accel.sh@19 -- # read -r var val 00:07:22.079 20:54:37 -- accel/accel.sh@20 -- # val= 00:07:22.079 20:54:37 -- accel/accel.sh@21 -- # case "$var" in 00:07:22.079 20:54:37 -- accel/accel.sh@19 -- # IFS=: 00:07:22.079 20:54:37 -- accel/accel.sh@19 -- # read -r var val 00:07:22.079 20:54:37 -- accel/accel.sh@20 -- # val= 00:07:22.079 20:54:37 -- accel/accel.sh@21 -- # case "$var" in 00:07:22.079 20:54:37 -- accel/accel.sh@19 -- # IFS=: 00:07:22.079 20:54:37 -- accel/accel.sh@19 -- # read -r var val 00:07:22.079 20:54:37 -- accel/accel.sh@20 -- # val=0x1 00:07:22.079 20:54:37 -- accel/accel.sh@21 -- # case "$var" in 00:07:22.079 20:54:37 -- accel/accel.sh@19 -- # IFS=: 00:07:22.079 20:54:37 -- accel/accel.sh@19 -- # read -r var val 00:07:22.079 20:54:37 -- accel/accel.sh@20 -- # val= 00:07:22.079 20:54:37 -- accel/accel.sh@21 -- # case "$var" in 00:07:22.079 20:54:37 -- accel/accel.sh@19 -- # IFS=: 00:07:22.079 20:54:37 -- accel/accel.sh@19 -- # read -r var val 00:07:22.079 20:54:37 -- accel/accel.sh@20 -- # val= 00:07:22.079 20:54:37 -- accel/accel.sh@21 -- # case "$var" in 00:07:22.079 20:54:37 -- accel/accel.sh@19 -- # IFS=: 00:07:22.079 20:54:37 -- accel/accel.sh@19 -- # read -r var val 00:07:22.079 20:54:37 -- accel/accel.sh@20 -- # val=decompress 00:07:22.079 20:54:37 -- accel/accel.sh@21 -- # case "$var" in 00:07:22.079 20:54:37 -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:22.079 20:54:37 -- accel/accel.sh@19 -- # IFS=: 00:07:22.079 20:54:37 -- accel/accel.sh@19 -- # read -r var val 00:07:22.079 20:54:37 -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:22.079 20:54:37 -- accel/accel.sh@21 -- # case "$var" in 00:07:22.079 20:54:37 -- accel/accel.sh@19 -- # IFS=: 00:07:22.079 20:54:37 -- accel/accel.sh@19 -- # read -r var val 00:07:22.079 20:54:37 -- accel/accel.sh@20 -- # val= 00:07:22.079 20:54:37 -- accel/accel.sh@21 -- # case "$var" in 00:07:22.079 20:54:37 -- accel/accel.sh@19 -- # IFS=: 00:07:22.079 20:54:37 -- accel/accel.sh@19 -- # read -r var val 00:07:22.079 20:54:37 -- accel/accel.sh@20 -- # val=software 00:07:22.079 20:54:37 -- accel/accel.sh@21 -- # case "$var" in 00:07:22.079 20:54:37 -- accel/accel.sh@22 -- # accel_module=software 00:07:22.079 20:54:37 -- accel/accel.sh@19 -- # IFS=: 00:07:22.079 20:54:37 -- accel/accel.sh@19 -- # read -r var val 00:07:22.079 20:54:37 -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:22.079 20:54:37 -- accel/accel.sh@21 -- # case "$var" in 00:07:22.079 20:54:37 -- accel/accel.sh@19 -- # IFS=: 00:07:22.079 20:54:37 -- accel/accel.sh@19 -- # read -r var val 00:07:22.079 20:54:37 -- accel/accel.sh@20 -- # val=32 00:07:22.079 20:54:37 -- accel/accel.sh@21 -- # case "$var" in 00:07:22.079 20:54:37 -- accel/accel.sh@19 -- # IFS=: 00:07:22.079 20:54:37 -- accel/accel.sh@19 -- # read -r var val 00:07:22.079 20:54:37 -- accel/accel.sh@20 -- # val=32 00:07:22.079 20:54:37 -- accel/accel.sh@21 -- # case "$var" in 00:07:22.079 20:54:37 -- accel/accel.sh@19 -- # IFS=: 00:07:22.079 20:54:37 -- accel/accel.sh@19 -- # read -r var val 00:07:22.079 20:54:37 -- accel/accel.sh@20 -- # val=1 
00:07:22.079 20:54:37 -- accel/accel.sh@21 -- # case "$var" in 00:07:22.079 20:54:37 -- accel/accel.sh@19 -- # IFS=: 00:07:22.079 20:54:37 -- accel/accel.sh@19 -- # read -r var val 00:07:22.079 20:54:37 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:22.079 20:54:37 -- accel/accel.sh@21 -- # case "$var" in 00:07:22.079 20:54:37 -- accel/accel.sh@19 -- # IFS=: 00:07:22.079 20:54:37 -- accel/accel.sh@19 -- # read -r var val 00:07:22.079 20:54:37 -- accel/accel.sh@20 -- # val=Yes 00:07:22.079 20:54:37 -- accel/accel.sh@21 -- # case "$var" in 00:07:22.079 20:54:37 -- accel/accel.sh@19 -- # IFS=: 00:07:22.079 20:54:37 -- accel/accel.sh@19 -- # read -r var val 00:07:22.079 20:54:37 -- accel/accel.sh@20 -- # val= 00:07:22.079 20:54:37 -- accel/accel.sh@21 -- # case "$var" in 00:07:22.079 20:54:37 -- accel/accel.sh@19 -- # IFS=: 00:07:22.079 20:54:37 -- accel/accel.sh@19 -- # read -r var val 00:07:22.079 20:54:37 -- accel/accel.sh@20 -- # val= 00:07:22.079 20:54:37 -- accel/accel.sh@21 -- # case "$var" in 00:07:22.079 20:54:37 -- accel/accel.sh@19 -- # IFS=: 00:07:22.079 20:54:37 -- accel/accel.sh@19 -- # read -r var val 00:07:23.013 20:54:38 -- accel/accel.sh@20 -- # val= 00:07:23.013 20:54:38 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.013 20:54:38 -- accel/accel.sh@19 -- # IFS=: 00:07:23.013 20:54:38 -- accel/accel.sh@19 -- # read -r var val 00:07:23.013 20:54:38 -- accel/accel.sh@20 -- # val= 00:07:23.013 20:54:38 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.013 20:54:38 -- accel/accel.sh@19 -- # IFS=: 00:07:23.013 20:54:38 -- accel/accel.sh@19 -- # read -r var val 00:07:23.013 20:54:38 -- accel/accel.sh@20 -- # val= 00:07:23.013 20:54:38 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.013 20:54:38 -- accel/accel.sh@19 -- # IFS=: 00:07:23.013 20:54:38 -- accel/accel.sh@19 -- # read -r var val 00:07:23.013 20:54:38 -- accel/accel.sh@20 -- # val= 00:07:23.013 20:54:38 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.013 20:54:38 -- accel/accel.sh@19 -- # IFS=: 00:07:23.013 20:54:38 -- accel/accel.sh@19 -- # read -r var val 00:07:23.013 20:54:38 -- accel/accel.sh@20 -- # val= 00:07:23.013 20:54:38 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.013 20:54:38 -- accel/accel.sh@19 -- # IFS=: 00:07:23.013 20:54:38 -- accel/accel.sh@19 -- # read -r var val 00:07:23.013 20:54:38 -- accel/accel.sh@20 -- # val= 00:07:23.013 20:54:38 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.013 20:54:38 -- accel/accel.sh@19 -- # IFS=: 00:07:23.014 20:54:38 -- accel/accel.sh@19 -- # read -r var val 00:07:23.014 20:54:38 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:23.014 20:54:38 -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:23.014 20:54:38 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:23.014 00:07:23.014 real 0m1.297s 00:07:23.014 user 0m1.172s 00:07:23.014 sys 0m0.130s 00:07:23.014 20:54:38 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:23.014 20:54:38 -- common/autotest_common.sh@10 -- # set +x 00:07:23.014 ************************************ 00:07:23.014 END TEST accel_decmop_full 00:07:23.014 ************************************ 00:07:23.273 20:54:38 -- accel/accel.sh@119 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:23.273 20:54:38 -- common/autotest_common.sh@1087 -- # '[' 11 -le 1 ']' 00:07:23.273 20:54:38 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:23.273 20:54:38 -- common/autotest_common.sh@10 -- # set +x 00:07:23.273 
************************************
00:07:23.273 START TEST accel_decomp_mcore
************************************
00:07:23.273 20:54:38 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf
00:07:23.273 20:54:38 -- accel/accel.sh@16 -- # local accel_opc
00:07:23.273 20:54:38 -- accel/accel.sh@17 -- # local accel_module
00:07:23.273 20:54:38 -- accel/accel.sh@19 -- # IFS=:
00:07:23.273 20:54:38 -- accel/accel.sh@19 -- # read -r var val
00:07:23.273 20:54:38 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf
00:07:23.273 20:54:38 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf
00:07:23.273 20:54:38 -- accel/accel.sh@12 -- # build_accel_config
00:07:23.273 20:54:38 -- accel/accel.sh@31 -- # accel_json_cfg=()
00:07:23.273 20:54:38 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:07:23.273 20:54:38 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:07:23.273 20:54:38 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:07:23.273 20:54:38 -- accel/accel.sh@36 -- # [[ -n '' ]]
00:07:23.273 20:54:38 -- accel/accel.sh@40 -- # local IFS=,
00:07:23.273 20:54:38 -- accel/accel.sh@41 -- # jq -r .
00:07:23.273 [2024-04-25 20:54:38.898403] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization...
00:07:23.273 [2024-04-25 20:54:38.898482] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid190007 ]
00:07:23.273 EAL: No free 2048 kB hugepages reported on node 1
00:07:23.532 [2024-04-25 20:54:38.936532] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation.
00:07:23.532 [2024-04-25 20:54:38.968555] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:23.532 [2024-04-25 20:54:39.007165] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:23.532 [2024-04-25 20:54:39.007262] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:23.532 [2024-04-25 20:54:39.007345] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:23.532 [2024-04-25 20:54:39.007347] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:23.532 20:54:39 -- accel/accel.sh@20 -- # val= 00:07:23.532 20:54:39 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.532 20:54:39 -- accel/accel.sh@19 -- # IFS=: 00:07:23.532 20:54:39 -- accel/accel.sh@19 -- # read -r var val 00:07:23.532 20:54:39 -- accel/accel.sh@20 -- # val= 00:07:23.532 20:54:39 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.532 20:54:39 -- accel/accel.sh@19 -- # IFS=: 00:07:23.532 20:54:39 -- accel/accel.sh@19 -- # read -r var val 00:07:23.532 20:54:39 -- accel/accel.sh@20 -- # val= 00:07:23.532 20:54:39 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.532 20:54:39 -- accel/accel.sh@19 -- # IFS=: 00:07:23.532 20:54:39 -- accel/accel.sh@19 -- # read -r var val 00:07:23.532 20:54:39 -- accel/accel.sh@20 -- # val=0xf 00:07:23.532 20:54:39 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.532 20:54:39 -- accel/accel.sh@19 -- # IFS=: 00:07:23.532 20:54:39 -- accel/accel.sh@19 -- # read -r var val 00:07:23.532 20:54:39 -- accel/accel.sh@20 -- # val= 00:07:23.532 20:54:39 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.532 20:54:39 -- accel/accel.sh@19 -- # IFS=: 00:07:23.532 20:54:39 -- accel/accel.sh@19 -- # read -r var val 00:07:23.532 20:54:39 -- accel/accel.sh@20 -- # val= 00:07:23.532 20:54:39 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.532 20:54:39 -- accel/accel.sh@19 -- # IFS=: 00:07:23.532 20:54:39 -- accel/accel.sh@19 -- # read -r var val 00:07:23.532 20:54:39 -- accel/accel.sh@20 -- # val=decompress 00:07:23.532 20:54:39 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.532 20:54:39 -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:23.532 20:54:39 -- accel/accel.sh@19 -- # IFS=: 00:07:23.532 20:54:39 -- accel/accel.sh@19 -- # read -r var val 00:07:23.532 20:54:39 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:23.532 20:54:39 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.532 20:54:39 -- accel/accel.sh@19 -- # IFS=: 00:07:23.532 20:54:39 -- accel/accel.sh@19 -- # read -r var val 00:07:23.532 20:54:39 -- accel/accel.sh@20 -- # val= 00:07:23.532 20:54:39 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.532 20:54:39 -- accel/accel.sh@19 -- # IFS=: 00:07:23.532 20:54:39 -- accel/accel.sh@19 -- # read -r var val 00:07:23.532 20:54:39 -- accel/accel.sh@20 -- # val=software 00:07:23.532 20:54:39 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.532 20:54:39 -- accel/accel.sh@22 -- # accel_module=software 00:07:23.532 20:54:39 -- accel/accel.sh@19 -- # IFS=: 00:07:23.532 20:54:39 -- accel/accel.sh@19 -- # read -r var val 00:07:23.532 20:54:39 -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:23.532 20:54:39 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.532 20:54:39 -- accel/accel.sh@19 -- # IFS=: 00:07:23.532 20:54:39 -- accel/accel.sh@19 -- # read -r var val 00:07:23.532 20:54:39 -- accel/accel.sh@20 -- # val=32 00:07:23.532 20:54:39 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.532 20:54:39 -- accel/accel.sh@19 -- # IFS=: 00:07:23.532 20:54:39 -- accel/accel.sh@19 
-- # read -r var val 00:07:23.532 20:54:39 -- accel/accel.sh@20 -- # val=32 00:07:23.532 20:54:39 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.532 20:54:39 -- accel/accel.sh@19 -- # IFS=: 00:07:23.532 20:54:39 -- accel/accel.sh@19 -- # read -r var val 00:07:23.532 20:54:39 -- accel/accel.sh@20 -- # val=1 00:07:23.532 20:54:39 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.532 20:54:39 -- accel/accel.sh@19 -- # IFS=: 00:07:23.532 20:54:39 -- accel/accel.sh@19 -- # read -r var val 00:07:23.532 20:54:39 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:23.532 20:54:39 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.532 20:54:39 -- accel/accel.sh@19 -- # IFS=: 00:07:23.532 20:54:39 -- accel/accel.sh@19 -- # read -r var val 00:07:23.532 20:54:39 -- accel/accel.sh@20 -- # val=Yes 00:07:23.532 20:54:39 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.532 20:54:39 -- accel/accel.sh@19 -- # IFS=: 00:07:23.532 20:54:39 -- accel/accel.sh@19 -- # read -r var val 00:07:23.532 20:54:39 -- accel/accel.sh@20 -- # val= 00:07:23.532 20:54:39 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.532 20:54:39 -- accel/accel.sh@19 -- # IFS=: 00:07:23.532 20:54:39 -- accel/accel.sh@19 -- # read -r var val 00:07:23.532 20:54:39 -- accel/accel.sh@20 -- # val= 00:07:23.532 20:54:39 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.532 20:54:39 -- accel/accel.sh@19 -- # IFS=: 00:07:23.532 20:54:39 -- accel/accel.sh@19 -- # read -r var val 00:07:24.909 20:54:40 -- accel/accel.sh@20 -- # val= 00:07:24.909 20:54:40 -- accel/accel.sh@21 -- # case "$var" in 00:07:24.909 20:54:40 -- accel/accel.sh@19 -- # IFS=: 00:07:24.909 20:54:40 -- accel/accel.sh@19 -- # read -r var val 00:07:24.909 20:54:40 -- accel/accel.sh@20 -- # val= 00:07:24.909 20:54:40 -- accel/accel.sh@21 -- # case "$var" in 00:07:24.909 20:54:40 -- accel/accel.sh@19 -- # IFS=: 00:07:24.909 20:54:40 -- accel/accel.sh@19 -- # read -r var val 00:07:24.909 20:54:40 -- accel/accel.sh@20 -- # val= 00:07:24.909 20:54:40 -- accel/accel.sh@21 -- # case "$var" in 00:07:24.909 20:54:40 -- accel/accel.sh@19 -- # IFS=: 00:07:24.909 20:54:40 -- accel/accel.sh@19 -- # read -r var val 00:07:24.909 20:54:40 -- accel/accel.sh@20 -- # val= 00:07:24.909 20:54:40 -- accel/accel.sh@21 -- # case "$var" in 00:07:24.909 20:54:40 -- accel/accel.sh@19 -- # IFS=: 00:07:24.909 20:54:40 -- accel/accel.sh@19 -- # read -r var val 00:07:24.909 20:54:40 -- accel/accel.sh@20 -- # val= 00:07:24.909 20:54:40 -- accel/accel.sh@21 -- # case "$var" in 00:07:24.909 20:54:40 -- accel/accel.sh@19 -- # IFS=: 00:07:24.909 20:54:40 -- accel/accel.sh@19 -- # read -r var val 00:07:24.909 20:54:40 -- accel/accel.sh@20 -- # val= 00:07:24.909 20:54:40 -- accel/accel.sh@21 -- # case "$var" in 00:07:24.909 20:54:40 -- accel/accel.sh@19 -- # IFS=: 00:07:24.909 20:54:40 -- accel/accel.sh@19 -- # read -r var val 00:07:24.909 20:54:40 -- accel/accel.sh@20 -- # val= 00:07:24.909 20:54:40 -- accel/accel.sh@21 -- # case "$var" in 00:07:24.909 20:54:40 -- accel/accel.sh@19 -- # IFS=: 00:07:24.909 20:54:40 -- accel/accel.sh@19 -- # read -r var val 00:07:24.909 20:54:40 -- accel/accel.sh@20 -- # val= 00:07:24.909 20:54:40 -- accel/accel.sh@21 -- # case "$var" in 00:07:24.909 20:54:40 -- accel/accel.sh@19 -- # IFS=: 00:07:24.909 20:54:40 -- accel/accel.sh@19 -- # read -r var val 00:07:24.909 20:54:40 -- accel/accel.sh@20 -- # val= 00:07:24.909 20:54:40 -- accel/accel.sh@21 -- # case "$var" in 00:07:24.909 20:54:40 -- accel/accel.sh@19 -- # IFS=: 00:07:24.909 20:54:40 -- accel/accel.sh@19 -- # read -r var val 
00:07:24.909 20:54:40 -- accel/accel.sh@27 -- # [[ -n software ]]
00:07:24.909 20:54:40 -- accel/accel.sh@27 -- # [[ -n decompress ]]
00:07:24.909 20:54:40 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:07:24.909
00:07:24.909 real 0m1.308s
00:07:24.909 user 0m4.500s
00:07:24.909 sys 0m0.149s
00:07:24.909 20:54:40 -- common/autotest_common.sh@1112 -- # xtrace_disable
00:07:24.909 20:54:40 -- common/autotest_common.sh@10 -- # set +x
00:07:24.909 ************************************
00:07:24.909 END TEST accel_decomp_mcore
00:07:24.909 ************************************
00:07:24.909 20:54:40 -- accel/accel.sh@120 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf
00:07:24.910 20:54:40 -- common/autotest_common.sh@1087 -- # '[' 13 -le 1 ']'
00:07:24.910 20:54:40 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:07:24.910 20:54:40 -- common/autotest_common.sh@10 -- # set +x
00:07:24.910 ************************************
00:07:24.910 START TEST accel_decomp_full_mcore
00:07:24.910 ************************************
00:07:24.910 20:54:40 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf
00:07:24.910 20:54:40 -- accel/accel.sh@16 -- # local accel_opc
00:07:24.910 20:54:40 -- accel/accel.sh@17 -- # local accel_module
00:07:24.910 20:54:40 -- accel/accel.sh@19 -- # IFS=:
00:07:24.910 20:54:40 -- accel/accel.sh@19 -- # read -r var val
00:07:24.910 20:54:40 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf
00:07:24.910 20:54:40 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf
00:07:24.910 20:54:40 -- accel/accel.sh@12 -- # build_accel_config
00:07:24.910 20:54:40 -- accel/accel.sh@31 -- # accel_json_cfg=()
00:07:24.910 20:54:40 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:07:24.910 20:54:40 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:07:24.910 20:54:40 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:07:24.910 20:54:40 -- accel/accel.sh@36 -- # [[ -n '' ]]
00:07:24.910 20:54:40 -- accel/accel.sh@40 -- # local IFS=,
00:07:24.910 20:54:40 -- accel/accel.sh@41 -- # jq -r .
00:07:24.910 [2024-04-25 20:54:40.404392] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization...
00:07:24.910 [2024-04-25 20:54:40.404479] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid190282 ]
00:07:24.910 EAL: No free 2048 kB hugepages reported on node 1
00:07:24.910 [2024-04-25 20:54:40.444148] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation.
00:07:24.910 [2024-04-25 20:54:40.476355] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:24.910 [2024-04-25 20:54:40.519919] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:24.910 [2024-04-25 20:54:40.520021] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:24.910 [2024-04-25 20:54:40.520052] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:24.910 [2024-04-25 20:54:40.520054] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:24.910 20:54:40 -- accel/accel.sh@20 -- # val= 00:07:24.910 20:54:40 -- accel/accel.sh@21 -- # case "$var" in 00:07:24.910 20:54:40 -- accel/accel.sh@19 -- # IFS=: 00:07:24.910 20:54:40 -- accel/accel.sh@19 -- # read -r var val 00:07:24.910 20:54:40 -- accel/accel.sh@20 -- # val= 00:07:24.910 20:54:40 -- accel/accel.sh@21 -- # case "$var" in 00:07:24.910 20:54:40 -- accel/accel.sh@19 -- # IFS=: 00:07:24.910 20:54:40 -- accel/accel.sh@19 -- # read -r var val 00:07:24.910 20:54:40 -- accel/accel.sh@20 -- # val= 00:07:24.910 20:54:40 -- accel/accel.sh@21 -- # case "$var" in 00:07:24.910 20:54:40 -- accel/accel.sh@19 -- # IFS=: 00:07:24.910 20:54:40 -- accel/accel.sh@19 -- # read -r var val 00:07:24.910 20:54:40 -- accel/accel.sh@20 -- # val=0xf 00:07:24.910 20:54:40 -- accel/accel.sh@21 -- # case "$var" in 00:07:24.910 20:54:40 -- accel/accel.sh@19 -- # IFS=: 00:07:24.910 20:54:40 -- accel/accel.sh@19 -- # read -r var val 00:07:24.910 20:54:40 -- accel/accel.sh@20 -- # val= 00:07:24.910 20:54:40 -- accel/accel.sh@21 -- # case "$var" in 00:07:24.910 20:54:40 -- accel/accel.sh@19 -- # IFS=: 00:07:24.910 20:54:40 -- accel/accel.sh@19 -- # read -r var val 00:07:24.910 20:54:40 -- accel/accel.sh@20 -- # val= 00:07:24.910 20:54:40 -- accel/accel.sh@21 -- # case "$var" in 00:07:24.910 20:54:40 -- accel/accel.sh@19 -- # IFS=: 00:07:24.910 20:54:40 -- accel/accel.sh@19 -- # read -r var val 00:07:24.910 20:54:40 -- accel/accel.sh@20 -- # val=decompress 00:07:24.910 20:54:40 -- accel/accel.sh@21 -- # case "$var" in 00:07:24.910 20:54:40 -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:24.910 20:54:40 -- accel/accel.sh@19 -- # IFS=: 00:07:24.910 20:54:40 -- accel/accel.sh@19 -- # read -r var val 00:07:24.910 20:54:40 -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:24.910 20:54:40 -- accel/accel.sh@21 -- # case "$var" in 00:07:24.910 20:54:40 -- accel/accel.sh@19 -- # IFS=: 00:07:24.910 20:54:40 -- accel/accel.sh@19 -- # read -r var val 00:07:25.169 20:54:40 -- accel/accel.sh@20 -- # val= 00:07:25.169 20:54:40 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.169 20:54:40 -- accel/accel.sh@19 -- # IFS=: 00:07:25.169 20:54:40 -- accel/accel.sh@19 -- # read -r var val 00:07:25.169 20:54:40 -- accel/accel.sh@20 -- # val=software 00:07:25.169 20:54:40 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.169 20:54:40 -- accel/accel.sh@22 -- # accel_module=software 00:07:25.169 20:54:40 -- accel/accel.sh@19 -- # IFS=: 00:07:25.169 20:54:40 -- accel/accel.sh@19 -- # read -r var val 00:07:25.169 20:54:40 -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:25.169 20:54:40 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.169 20:54:40 -- accel/accel.sh@19 -- # IFS=: 00:07:25.169 20:54:40 -- accel/accel.sh@19 -- # read -r var val 00:07:25.169 20:54:40 -- accel/accel.sh@20 -- # val=32 00:07:25.169 20:54:40 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.169 20:54:40 -- accel/accel.sh@19 -- # IFS=: 00:07:25.169 20:54:40 -- 
accel/accel.sh@19 -- # read -r var val 00:07:25.169 20:54:40 -- accel/accel.sh@20 -- # val=32 00:07:25.169 20:54:40 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.169 20:54:40 -- accel/accel.sh@19 -- # IFS=: 00:07:25.169 20:54:40 -- accel/accel.sh@19 -- # read -r var val 00:07:25.169 20:54:40 -- accel/accel.sh@20 -- # val=1 00:07:25.169 20:54:40 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.169 20:54:40 -- accel/accel.sh@19 -- # IFS=: 00:07:25.169 20:54:40 -- accel/accel.sh@19 -- # read -r var val 00:07:25.169 20:54:40 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:25.169 20:54:40 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.169 20:54:40 -- accel/accel.sh@19 -- # IFS=: 00:07:25.169 20:54:40 -- accel/accel.sh@19 -- # read -r var val 00:07:25.169 20:54:40 -- accel/accel.sh@20 -- # val=Yes 00:07:25.169 20:54:40 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.169 20:54:40 -- accel/accel.sh@19 -- # IFS=: 00:07:25.169 20:54:40 -- accel/accel.sh@19 -- # read -r var val 00:07:25.169 20:54:40 -- accel/accel.sh@20 -- # val= 00:07:25.169 20:54:40 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.169 20:54:40 -- accel/accel.sh@19 -- # IFS=: 00:07:25.169 20:54:40 -- accel/accel.sh@19 -- # read -r var val 00:07:25.169 20:54:40 -- accel/accel.sh@20 -- # val= 00:07:25.169 20:54:40 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.169 20:54:40 -- accel/accel.sh@19 -- # IFS=: 00:07:25.169 20:54:40 -- accel/accel.sh@19 -- # read -r var val 00:07:26.107 20:54:41 -- accel/accel.sh@20 -- # val= 00:07:26.107 20:54:41 -- accel/accel.sh@21 -- # case "$var" in 00:07:26.107 20:54:41 -- accel/accel.sh@19 -- # IFS=: 00:07:26.107 20:54:41 -- accel/accel.sh@19 -- # read -r var val 00:07:26.107 20:54:41 -- accel/accel.sh@20 -- # val= 00:07:26.107 20:54:41 -- accel/accel.sh@21 -- # case "$var" in 00:07:26.107 20:54:41 -- accel/accel.sh@19 -- # IFS=: 00:07:26.107 20:54:41 -- accel/accel.sh@19 -- # read -r var val 00:07:26.107 20:54:41 -- accel/accel.sh@20 -- # val= 00:07:26.107 20:54:41 -- accel/accel.sh@21 -- # case "$var" in 00:07:26.107 20:54:41 -- accel/accel.sh@19 -- # IFS=: 00:07:26.107 20:54:41 -- accel/accel.sh@19 -- # read -r var val 00:07:26.107 20:54:41 -- accel/accel.sh@20 -- # val= 00:07:26.107 20:54:41 -- accel/accel.sh@21 -- # case "$var" in 00:07:26.107 20:54:41 -- accel/accel.sh@19 -- # IFS=: 00:07:26.107 20:54:41 -- accel/accel.sh@19 -- # read -r var val 00:07:26.107 20:54:41 -- accel/accel.sh@20 -- # val= 00:07:26.107 20:54:41 -- accel/accel.sh@21 -- # case "$var" in 00:07:26.107 20:54:41 -- accel/accel.sh@19 -- # IFS=: 00:07:26.107 20:54:41 -- accel/accel.sh@19 -- # read -r var val 00:07:26.107 20:54:41 -- accel/accel.sh@20 -- # val= 00:07:26.107 20:54:41 -- accel/accel.sh@21 -- # case "$var" in 00:07:26.107 20:54:41 -- accel/accel.sh@19 -- # IFS=: 00:07:26.107 20:54:41 -- accel/accel.sh@19 -- # read -r var val 00:07:26.107 20:54:41 -- accel/accel.sh@20 -- # val= 00:07:26.107 20:54:41 -- accel/accel.sh@21 -- # case "$var" in 00:07:26.107 20:54:41 -- accel/accel.sh@19 -- # IFS=: 00:07:26.107 20:54:41 -- accel/accel.sh@19 -- # read -r var val 00:07:26.107 20:54:41 -- accel/accel.sh@20 -- # val= 00:07:26.107 20:54:41 -- accel/accel.sh@21 -- # case "$var" in 00:07:26.107 20:54:41 -- accel/accel.sh@19 -- # IFS=: 00:07:26.107 20:54:41 -- accel/accel.sh@19 -- # read -r var val 00:07:26.107 20:54:41 -- accel/accel.sh@20 -- # val= 00:07:26.107 20:54:41 -- accel/accel.sh@21 -- # case "$var" in 00:07:26.107 20:54:41 -- accel/accel.sh@19 -- # IFS=: 00:07:26.107 20:54:41 -- accel/accel.sh@19 -- # read 
-r var val
00:07:26.107 20:54:41 -- accel/accel.sh@27 -- # [[ -n software ]]
00:07:26.107 20:54:41 -- accel/accel.sh@27 -- # [[ -n decompress ]]
00:07:26.107 20:54:41 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:07:26.107
00:07:26.107 real 0m1.323s
00:07:26.107 user 0m4.538s
00:07:26.107 sys 0m0.149s
00:07:26.107 20:54:41 -- common/autotest_common.sh@1112 -- # xtrace_disable
00:07:26.107 20:54:41 -- common/autotest_common.sh@10 -- # set +x
00:07:26.107 ************************************
00:07:26.107 END TEST accel_decomp_full_mcore
00:07:26.107 ************************************
00:07:26.107 20:54:41 -- accel/accel.sh@121 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2
00:07:26.107 20:54:41 -- common/autotest_common.sh@1087 -- # '[' 11 -le 1 ']'
00:07:26.107 20:54:41 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:07:26.107 20:54:41 -- common/autotest_common.sh@10 -- # set +x
00:07:26.366 ************************************
00:07:26.366 START TEST accel_decomp_mthread
00:07:26.366 ************************************
00:07:26.366 20:54:41 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2
00:07:26.366 20:54:41 -- accel/accel.sh@16 -- # local accel_opc
00:07:26.366 20:54:41 -- accel/accel.sh@17 -- # local accel_module
00:07:26.366 20:54:41 -- accel/accel.sh@19 -- # IFS=:
00:07:26.366 20:54:41 -- accel/accel.sh@19 -- # read -r var val
00:07:26.366 20:54:41 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2
00:07:26.366 20:54:41 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2
00:07:26.366 20:54:41 -- accel/accel.sh@12 -- # build_accel_config
00:07:26.366 20:54:41 -- accel/accel.sh@31 -- # accel_json_cfg=()
00:07:26.366 20:54:41 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:07:26.366 20:54:41 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:07:26.366 20:54:41 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:07:26.366 20:54:41 -- accel/accel.sh@36 -- # [[ -n '' ]]
00:07:26.366 20:54:41 -- accel/accel.sh@40 -- # local IFS=,
00:07:26.366 20:54:41 -- accel/accel.sh@41 -- # jq -r .
00:07:26.366 [2024-04-25 20:54:41.930700] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization...
00:07:26.366 [2024-04-25 20:54:41.930781] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid190573 ]
00:07:26.366 EAL: No free 2048 kB hugepages reported on node 1
00:07:26.366 [2024-04-25 20:54:41.969949] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation.
00:07:26.366 [2024-04-25 20:54:42.002557] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:26.625 [2024-04-25 20:54:42.039142] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:26.625 20:54:42 -- accel/accel.sh@20 -- # val= 00:07:26.625 20:54:42 -- accel/accel.sh@21 -- # case "$var" in 00:07:26.625 20:54:42 -- accel/accel.sh@19 -- # IFS=: 00:07:26.625 20:54:42 -- accel/accel.sh@19 -- # read -r var val 00:07:26.625 20:54:42 -- accel/accel.sh@20 -- # val= 00:07:26.625 20:54:42 -- accel/accel.sh@21 -- # case "$var" in 00:07:26.625 20:54:42 -- accel/accel.sh@19 -- # IFS=: 00:07:26.625 20:54:42 -- accel/accel.sh@19 -- # read -r var val 00:07:26.625 20:54:42 -- accel/accel.sh@20 -- # val= 00:07:26.625 20:54:42 -- accel/accel.sh@21 -- # case "$var" in 00:07:26.625 20:54:42 -- accel/accel.sh@19 -- # IFS=: 00:07:26.625 20:54:42 -- accel/accel.sh@19 -- # read -r var val 00:07:26.625 20:54:42 -- accel/accel.sh@20 -- # val=0x1 00:07:26.625 20:54:42 -- accel/accel.sh@21 -- # case "$var" in 00:07:26.626 20:54:42 -- accel/accel.sh@19 -- # IFS=: 00:07:26.626 20:54:42 -- accel/accel.sh@19 -- # read -r var val 00:07:26.626 20:54:42 -- accel/accel.sh@20 -- # val= 00:07:26.626 20:54:42 -- accel/accel.sh@21 -- # case "$var" in 00:07:26.626 20:54:42 -- accel/accel.sh@19 -- # IFS=: 00:07:26.626 20:54:42 -- accel/accel.sh@19 -- # read -r var val 00:07:26.626 20:54:42 -- accel/accel.sh@20 -- # val= 00:07:26.626 20:54:42 -- accel/accel.sh@21 -- # case "$var" in 00:07:26.626 20:54:42 -- accel/accel.sh@19 -- # IFS=: 00:07:26.626 20:54:42 -- accel/accel.sh@19 -- # read -r var val 00:07:26.626 20:54:42 -- accel/accel.sh@20 -- # val=decompress 00:07:26.626 20:54:42 -- accel/accel.sh@21 -- # case "$var" in 00:07:26.626 20:54:42 -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:26.626 20:54:42 -- accel/accel.sh@19 -- # IFS=: 00:07:26.626 20:54:42 -- accel/accel.sh@19 -- # read -r var val 00:07:26.626 20:54:42 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:26.626 20:54:42 -- accel/accel.sh@21 -- # case "$var" in 00:07:26.626 20:54:42 -- accel/accel.sh@19 -- # IFS=: 00:07:26.626 20:54:42 -- accel/accel.sh@19 -- # read -r var val 00:07:26.626 20:54:42 -- accel/accel.sh@20 -- # val= 00:07:26.626 20:54:42 -- accel/accel.sh@21 -- # case "$var" in 00:07:26.626 20:54:42 -- accel/accel.sh@19 -- # IFS=: 00:07:26.626 20:54:42 -- accel/accel.sh@19 -- # read -r var val 00:07:26.626 20:54:42 -- accel/accel.sh@20 -- # val=software 00:07:26.626 20:54:42 -- accel/accel.sh@21 -- # case "$var" in 00:07:26.626 20:54:42 -- accel/accel.sh@22 -- # accel_module=software 00:07:26.626 20:54:42 -- accel/accel.sh@19 -- # IFS=: 00:07:26.626 20:54:42 -- accel/accel.sh@19 -- # read -r var val 00:07:26.626 20:54:42 -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:26.626 20:54:42 -- accel/accel.sh@21 -- # case "$var" in 00:07:26.626 20:54:42 -- accel/accel.sh@19 -- # IFS=: 00:07:26.626 20:54:42 -- accel/accel.sh@19 -- # read -r var val 00:07:26.626 20:54:42 -- accel/accel.sh@20 -- # val=32 00:07:26.626 20:54:42 -- accel/accel.sh@21 -- # case "$var" in 00:07:26.626 20:54:42 -- accel/accel.sh@19 -- # IFS=: 00:07:26.626 20:54:42 -- accel/accel.sh@19 -- # read -r var val 00:07:26.626 20:54:42 -- accel/accel.sh@20 -- # val=32 00:07:26.626 20:54:42 -- accel/accel.sh@21 -- # case "$var" in 00:07:26.626 20:54:42 -- accel/accel.sh@19 -- # IFS=: 00:07:26.626 20:54:42 -- accel/accel.sh@19 -- # read -r var val 00:07:26.626 20:54:42 -- accel/accel.sh@20 -- # val=2 
00:07:26.626 20:54:42 -- accel/accel.sh@21 -- # case "$var" in 00:07:26.626 20:54:42 -- accel/accel.sh@19 -- # IFS=: 00:07:26.626 20:54:42 -- accel/accel.sh@19 -- # read -r var val 00:07:26.626 20:54:42 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:26.626 20:54:42 -- accel/accel.sh@21 -- # case "$var" in 00:07:26.626 20:54:42 -- accel/accel.sh@19 -- # IFS=: 00:07:26.626 20:54:42 -- accel/accel.sh@19 -- # read -r var val 00:07:26.626 20:54:42 -- accel/accel.sh@20 -- # val=Yes 00:07:26.626 20:54:42 -- accel/accel.sh@21 -- # case "$var" in 00:07:26.626 20:54:42 -- accel/accel.sh@19 -- # IFS=: 00:07:26.626 20:54:42 -- accel/accel.sh@19 -- # read -r var val 00:07:26.626 20:54:42 -- accel/accel.sh@20 -- # val= 00:07:26.626 20:54:42 -- accel/accel.sh@21 -- # case "$var" in 00:07:26.626 20:54:42 -- accel/accel.sh@19 -- # IFS=: 00:07:26.626 20:54:42 -- accel/accel.sh@19 -- # read -r var val 00:07:26.626 20:54:42 -- accel/accel.sh@20 -- # val= 00:07:26.626 20:54:42 -- accel/accel.sh@21 -- # case "$var" in 00:07:26.626 20:54:42 -- accel/accel.sh@19 -- # IFS=: 00:07:26.626 20:54:42 -- accel/accel.sh@19 -- # read -r var val 00:07:27.563 20:54:43 -- accel/accel.sh@20 -- # val= 00:07:27.563 20:54:43 -- accel/accel.sh@21 -- # case "$var" in 00:07:27.563 20:54:43 -- accel/accel.sh@19 -- # IFS=: 00:07:27.563 20:54:43 -- accel/accel.sh@19 -- # read -r var val 00:07:27.563 20:54:43 -- accel/accel.sh@20 -- # val= 00:07:27.563 20:54:43 -- accel/accel.sh@21 -- # case "$var" in 00:07:27.563 20:54:43 -- accel/accel.sh@19 -- # IFS=: 00:07:27.563 20:54:43 -- accel/accel.sh@19 -- # read -r var val 00:07:27.563 20:54:43 -- accel/accel.sh@20 -- # val= 00:07:27.563 20:54:43 -- accel/accel.sh@21 -- # case "$var" in 00:07:27.563 20:54:43 -- accel/accel.sh@19 -- # IFS=: 00:07:27.563 20:54:43 -- accel/accel.sh@19 -- # read -r var val 00:07:27.563 20:54:43 -- accel/accel.sh@20 -- # val= 00:07:27.563 20:54:43 -- accel/accel.sh@21 -- # case "$var" in 00:07:27.563 20:54:43 -- accel/accel.sh@19 -- # IFS=: 00:07:27.563 20:54:43 -- accel/accel.sh@19 -- # read -r var val 00:07:27.563 20:54:43 -- accel/accel.sh@20 -- # val= 00:07:27.563 20:54:43 -- accel/accel.sh@21 -- # case "$var" in 00:07:27.563 20:54:43 -- accel/accel.sh@19 -- # IFS=: 00:07:27.563 20:54:43 -- accel/accel.sh@19 -- # read -r var val 00:07:27.563 20:54:43 -- accel/accel.sh@20 -- # val= 00:07:27.563 20:54:43 -- accel/accel.sh@21 -- # case "$var" in 00:07:27.563 20:54:43 -- accel/accel.sh@19 -- # IFS=: 00:07:27.563 20:54:43 -- accel/accel.sh@19 -- # read -r var val 00:07:27.563 20:54:43 -- accel/accel.sh@20 -- # val= 00:07:27.563 20:54:43 -- accel/accel.sh@21 -- # case "$var" in 00:07:27.563 20:54:43 -- accel/accel.sh@19 -- # IFS=: 00:07:27.563 20:54:43 -- accel/accel.sh@19 -- # read -r var val 00:07:27.563 20:54:43 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:27.563 20:54:43 -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:27.563 20:54:43 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:27.563 00:07:27.563 real 0m1.302s 00:07:27.563 user 0m1.187s 00:07:27.563 sys 0m0.131s 00:07:27.563 20:54:43 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:27.563 20:54:43 -- common/autotest_common.sh@10 -- # set +x 00:07:27.563 ************************************ 00:07:27.563 END TEST accel_decomp_mthread 00:07:27.563 ************************************ 00:07:27.821 20:54:43 -- accel/accel.sh@122 -- # run_test accel_deomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 
-T 2 00:07:27.821 20:54:43 -- common/autotest_common.sh@1087 -- # '[' 13 -le 1 ']' 00:07:27.821 20:54:43 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:27.821 20:54:43 -- common/autotest_common.sh@10 -- # set +x 00:07:27.821 ************************************ 00:07:27.821 START TEST accel_deomp_full_mthread 00:07:27.821 ************************************ 00:07:27.821 20:54:43 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:27.821 20:54:43 -- accel/accel.sh@16 -- # local accel_opc 00:07:27.821 20:54:43 -- accel/accel.sh@17 -- # local accel_module 00:07:27.821 20:54:43 -- accel/accel.sh@19 -- # IFS=: 00:07:27.821 20:54:43 -- accel/accel.sh@19 -- # read -r var val 00:07:27.821 20:54:43 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:27.821 20:54:43 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:27.821 20:54:43 -- accel/accel.sh@12 -- # build_accel_config 00:07:27.821 20:54:43 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:27.821 20:54:43 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:27.821 20:54:43 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:27.821 20:54:43 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:27.821 20:54:43 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:27.821 20:54:43 -- accel/accel.sh@40 -- # local IFS=, 00:07:27.821 20:54:43 -- accel/accel.sh@41 -- # jq -r . 00:07:27.821 [2024-04-25 20:54:43.439409] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:07:27.821 [2024-04-25 20:54:43.439494] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid190854 ] 00:07:27.821 EAL: No free 2048 kB hugepages reported on node 1 00:07:27.821 [2024-04-25 20:54:43.477360] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 
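For reference, the full-buffer multi-threaded decompress pass being set up here reduces to a single accel_perf invocation. A minimal re-run sketch, with paths as recorded in this workspace; accel.json is a stand-in for the JSON accel config that build_accel_config normally pipes in on /dev/fd/62:

    SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    # -t 1: run for one second; -w decompress: workload under test;
    # -T 2: two worker threads; -l: input file for the decompress workload;
    # -y and -o 0 are passed through exactly as the harness passes them above.
    "$SPDK/build/examples/accel_perf" -c accel.json -t 1 -w decompress \
        -l "$SPDK/test/accel/bib" -y -o 0 -T 2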
00:07:28.080 [2024-04-25 20:54:43.508688] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:28.080 [2024-04-25 20:54:43.544891] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:28.080 20:54:43 -- accel/accel.sh@20 -- # val= 00:07:28.080 20:54:43 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.080 20:54:43 -- accel/accel.sh@19 -- # IFS=: 00:07:28.080 20:54:43 -- accel/accel.sh@19 -- # read -r var val 00:07:28.080 20:54:43 -- accel/accel.sh@20 -- # val= 00:07:28.080 20:54:43 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.080 20:54:43 -- accel/accel.sh@19 -- # IFS=: 00:07:28.080 20:54:43 -- accel/accel.sh@19 -- # read -r var val 00:07:28.080 20:54:43 -- accel/accel.sh@20 -- # val= 00:07:28.080 20:54:43 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.080 20:54:43 -- accel/accel.sh@19 -- # IFS=: 00:07:28.080 20:54:43 -- accel/accel.sh@19 -- # read -r var val 00:07:28.080 20:54:43 -- accel/accel.sh@20 -- # val=0x1 00:07:28.080 20:54:43 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.080 20:54:43 -- accel/accel.sh@19 -- # IFS=: 00:07:28.080 20:54:43 -- accel/accel.sh@19 -- # read -r var val 00:07:28.080 20:54:43 -- accel/accel.sh@20 -- # val= 00:07:28.080 20:54:43 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.080 20:54:43 -- accel/accel.sh@19 -- # IFS=: 00:07:28.080 20:54:43 -- accel/accel.sh@19 -- # read -r var val 00:07:28.080 20:54:43 -- accel/accel.sh@20 -- # val= 00:07:28.080 20:54:43 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.080 20:54:43 -- accel/accel.sh@19 -- # IFS=: 00:07:28.080 20:54:43 -- accel/accel.sh@19 -- # read -r var val 00:07:28.080 20:54:43 -- accel/accel.sh@20 -- # val=decompress 00:07:28.080 20:54:43 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.080 20:54:43 -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:28.080 20:54:43 -- accel/accel.sh@19 -- # IFS=: 00:07:28.080 20:54:43 -- accel/accel.sh@19 -- # read -r var val 00:07:28.080 20:54:43 -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:28.080 20:54:43 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.080 20:54:43 -- accel/accel.sh@19 -- # IFS=: 00:07:28.080 20:54:43 -- accel/accel.sh@19 -- # read -r var val 00:07:28.080 20:54:43 -- accel/accel.sh@20 -- # val= 00:07:28.080 20:54:43 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.080 20:54:43 -- accel/accel.sh@19 -- # IFS=: 00:07:28.080 20:54:43 -- accel/accel.sh@19 -- # read -r var val 00:07:28.080 20:54:43 -- accel/accel.sh@20 -- # val=software 00:07:28.080 20:54:43 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.080 20:54:43 -- accel/accel.sh@22 -- # accel_module=software 00:07:28.080 20:54:43 -- accel/accel.sh@19 -- # IFS=: 00:07:28.080 20:54:43 -- accel/accel.sh@19 -- # read -r var val 00:07:28.080 20:54:43 -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:28.080 20:54:43 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.080 20:54:43 -- accel/accel.sh@19 -- # IFS=: 00:07:28.080 20:54:43 -- accel/accel.sh@19 -- # read -r var val 00:07:28.080 20:54:43 -- accel/accel.sh@20 -- # val=32 00:07:28.080 20:54:43 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.080 20:54:43 -- accel/accel.sh@19 -- # IFS=: 00:07:28.080 20:54:43 -- accel/accel.sh@19 -- # read -r var val 00:07:28.080 20:54:43 -- accel/accel.sh@20 -- # val=32 00:07:28.080 20:54:43 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.080 20:54:43 -- accel/accel.sh@19 -- # IFS=: 00:07:28.080 20:54:43 -- accel/accel.sh@19 -- # read -r var val 00:07:28.080 20:54:43 -- accel/accel.sh@20 -- # val=2 
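The repeating val=/case "$var" lines in this trace are accel.sh parsing accel_perf's colon-separated self-description one field at a time; the two variables it latches onto (accel_opc, accel_module) are the ones checked at the end of each run. A condensed sketch of that loop, inferred from the trace rather than copied from accel.sh, with perf_output standing in for the captured accel_perf stdout:

    # Each output line is split on ':' into var/val; the case statement picks
    # out the opcode and the module that serviced it (patterns are illustrative).
    while IFS=: read -r var val; do
      case "$var" in
        *accel_opc*)    accel_opc=$val ;;     # e.g. decompress
        *accel_module*) accel_module=$val ;;  # e.g. software
      esac
    done < "$perf_output"
    [[ -n $accel_opc && -n $accel_module ]]   # the accel.sh@27 checks after each run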
00:07:28.080 20:54:43 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.080 20:54:43 -- accel/accel.sh@19 -- # IFS=: 00:07:28.080 20:54:43 -- accel/accel.sh@19 -- # read -r var val 00:07:28.080 20:54:43 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:28.080 20:54:43 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.080 20:54:43 -- accel/accel.sh@19 -- # IFS=: 00:07:28.081 20:54:43 -- accel/accel.sh@19 -- # read -r var val 00:07:28.081 20:54:43 -- accel/accel.sh@20 -- # val=Yes 00:07:28.081 20:54:43 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.081 20:54:43 -- accel/accel.sh@19 -- # IFS=: 00:07:28.081 20:54:43 -- accel/accel.sh@19 -- # read -r var val 00:07:28.081 20:54:43 -- accel/accel.sh@20 -- # val= 00:07:28.081 20:54:43 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.081 20:54:43 -- accel/accel.sh@19 -- # IFS=: 00:07:28.081 20:54:43 -- accel/accel.sh@19 -- # read -r var val 00:07:28.081 20:54:43 -- accel/accel.sh@20 -- # val= 00:07:28.081 20:54:43 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.081 20:54:43 -- accel/accel.sh@19 -- # IFS=: 00:07:28.081 20:54:43 -- accel/accel.sh@19 -- # read -r var val 00:07:29.494 20:54:44 -- accel/accel.sh@20 -- # val= 00:07:29.494 20:54:44 -- accel/accel.sh@21 -- # case "$var" in 00:07:29.494 20:54:44 -- accel/accel.sh@19 -- # IFS=: 00:07:29.494 20:54:44 -- accel/accel.sh@19 -- # read -r var val 00:07:29.494 20:54:44 -- accel/accel.sh@20 -- # val= 00:07:29.494 20:54:44 -- accel/accel.sh@21 -- # case "$var" in 00:07:29.494 20:54:44 -- accel/accel.sh@19 -- # IFS=: 00:07:29.494 20:54:44 -- accel/accel.sh@19 -- # read -r var val 00:07:29.494 20:54:44 -- accel/accel.sh@20 -- # val= 00:07:29.494 20:54:44 -- accel/accel.sh@21 -- # case "$var" in 00:07:29.494 20:54:44 -- accel/accel.sh@19 -- # IFS=: 00:07:29.494 20:54:44 -- accel/accel.sh@19 -- # read -r var val 00:07:29.494 20:54:44 -- accel/accel.sh@20 -- # val= 00:07:29.494 20:54:44 -- accel/accel.sh@21 -- # case "$var" in 00:07:29.494 20:54:44 -- accel/accel.sh@19 -- # IFS=: 00:07:29.494 20:54:44 -- accel/accel.sh@19 -- # read -r var val 00:07:29.494 20:54:44 -- accel/accel.sh@20 -- # val= 00:07:29.494 20:54:44 -- accel/accel.sh@21 -- # case "$var" in 00:07:29.494 20:54:44 -- accel/accel.sh@19 -- # IFS=: 00:07:29.494 20:54:44 -- accel/accel.sh@19 -- # read -r var val 00:07:29.494 20:54:44 -- accel/accel.sh@20 -- # val= 00:07:29.494 20:54:44 -- accel/accel.sh@21 -- # case "$var" in 00:07:29.494 20:54:44 -- accel/accel.sh@19 -- # IFS=: 00:07:29.494 20:54:44 -- accel/accel.sh@19 -- # read -r var val 00:07:29.494 20:54:44 -- accel/accel.sh@20 -- # val= 00:07:29.494 20:54:44 -- accel/accel.sh@21 -- # case "$var" in 00:07:29.494 20:54:44 -- accel/accel.sh@19 -- # IFS=: 00:07:29.494 20:54:44 -- accel/accel.sh@19 -- # read -r var val 00:07:29.494 20:54:44 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:29.494 20:54:44 -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:29.494 20:54:44 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:29.494 00:07:29.494 real 0m1.321s 00:07:29.494 user 0m1.215s 00:07:29.494 sys 0m0.122s 00:07:29.494 20:54:44 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:29.494 20:54:44 -- common/autotest_common.sh@10 -- # set +x 00:07:29.494 ************************************ 00:07:29.494 END TEST accel_deomp_full_mthread 00:07:29.494 ************************************ 00:07:29.494 20:54:44 -- accel/accel.sh@124 -- # [[ n == y ]] 00:07:29.494 20:54:44 -- accel/accel.sh@137 -- # run_test accel_dif_functional_tests 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:29.494 20:54:44 -- accel/accel.sh@137 -- # build_accel_config 00:07:29.494 20:54:44 -- common/autotest_common.sh@1087 -- # '[' 4 -le 1 ']' 00:07:29.494 20:54:44 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:29.494 20:54:44 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:29.494 20:54:44 -- common/autotest_common.sh@10 -- # set +x 00:07:29.494 20:54:44 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:29.494 20:54:44 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:29.494 20:54:44 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:29.494 20:54:44 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:29.494 20:54:44 -- accel/accel.sh@40 -- # local IFS=, 00:07:29.494 20:54:44 -- accel/accel.sh@41 -- # jq -r . 00:07:29.494 ************************************ 00:07:29.494 START TEST accel_dif_functional_tests 00:07:29.494 ************************************ 00:07:29.494 20:54:44 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:29.494 [2024-04-25 20:54:44.976144] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:07:29.494 [2024-04-25 20:54:44.976228] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid191140 ] 00:07:29.494 EAL: No free 2048 kB hugepages reported on node 1 00:07:29.494 [2024-04-25 20:54:45.015495] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:29.494 [2024-04-25 20:54:45.048905] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:29.495 [2024-04-25 20:54:45.091623] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:29.495 [2024-04-25 20:54:45.091721] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:29.495 [2024-04-25 20:54:45.091723] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:29.495 00:07:29.495 00:07:29.495 CUnit - A unit testing framework for C - Version 2.1-3 00:07:29.495 http://cunit.sourceforge.net/ 00:07:29.495 00:07:29.495 00:07:29.495 Suite: accel_dif 00:07:29.495 Test: verify: DIF generated, GUARD check ...passed 00:07:29.495 Test: verify: DIF generated, APPTAG check ...passed 00:07:29.495 Test: verify: DIF generated, REFTAG check ...passed 00:07:29.495 Test: verify: DIF not generated, GUARD check ...[2024-04-25 20:54:45.156102] dif.c: 828:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:29.495 [2024-04-25 20:54:45.156147] dif.c: 828:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:29.495 passed 00:07:29.495 Test: verify: DIF not generated, APPTAG check ...[2024-04-25 20:54:45.156181] dif.c: 843:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:29.495 [2024-04-25 20:54:45.156204] dif.c: 843:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:29.495 passed 00:07:29.495 Test: verify: DIF not generated, REFTAG check ...[2024-04-25 20:54:45.156227] dif.c: 778:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:29.495 [2024-04-25 20:54:45.156246] dif.c: 778:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, 
Actual=5a5a5a5a 00:07:29.495 passed 00:07:29.495 Test: verify: APPTAG correct, APPTAG check ...passed 00:07:29.495 Test: verify: APPTAG incorrect, APPTAG check ...[2024-04-25 20:54:45.156291] dif.c: 843:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:07:29.495 passed 00:07:29.495 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:07:29.495 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:07:29.495 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:07:29.495 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-04-25 20:54:45.156393] dif.c: 778:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:07:29.495 passed 00:07:29.495 Test: generate copy: DIF generated, GUARD check ...passed 00:07:29.495 Test: generate copy: DIF generated, APTTAG check ...passed 00:07:29.495 Test: generate copy: DIF generated, REFTAG check ...passed 00:07:29.495 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:07:29.495 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:07:29.495 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:07:29.495 Test: generate copy: iovecs-len validate ...[2024-04-25 20:54:45.156565] dif.c:1190:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 00:07:29.495 passed 00:07:29.495 Test: generate copy: buffer alignment validate ...passed 00:07:29.495 00:07:29.755 Run Summary: Type Total Ran Passed Failed Inactive 00:07:29.755 suites 1 1 n/a 0 0 00:07:29.755 tests 20 20 20 0 0 00:07:29.755 asserts 204 204 204 0 n/a 00:07:29.755 00:07:29.755 Elapsed time = 0.002 seconds 00:07:29.755 00:07:29.755 real 0m0.356s 00:07:29.755 user 0m0.498s 00:07:29.755 sys 0m0.162s 00:07:29.755 20:54:45 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:29.755 20:54:45 -- common/autotest_common.sh@10 -- # set +x 00:07:29.755 ************************************ 00:07:29.755 END TEST accel_dif_functional_tests 00:07:29.755 ************************************ 00:07:29.755 00:07:29.755 real 0m32.984s 00:07:29.755 user 0m33.649s 00:07:29.755 sys 0m6.560s 00:07:29.755 20:54:45 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:29.755 20:54:45 -- common/autotest_common.sh@10 -- # set +x 00:07:29.755 ************************************ 00:07:29.755 END TEST accel 00:07:29.755 ************************************ 00:07:29.755 20:54:45 -- spdk/autotest.sh@180 -- # run_test accel_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:29.755 20:54:45 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:29.755 20:54:45 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:29.755 20:54:45 -- common/autotest_common.sh@10 -- # set +x 00:07:30.014 ************************************ 00:07:30.014 START TEST accel_rpc 00:07:30.014 ************************************ 00:07:30.014 20:54:45 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:30.014 * Looking for test storage... 
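All of the dif.c *ERROR* lines in the run above are deliberate: each negative verify test corrupts exactly one protection field and asserts that the software DIF path flags the right mismatch (Guard CRC, then App Tag, then Ref Tag), while the generate-copy cases cover the check flags and iovec sizing. The suite itself is a CUnit binary driven the same way as accel_perf; a sketch, with accel.json again standing in for the config fed on /dev/fd/62:

    # Runs the accel_dif CUnit suite (20 tests, 204 asserts in the summary above).
    "$SPDK/test/accel/dif/dif" -c accel.json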
00:07:30.014 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:07:30.014 20:54:45 -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:30.014 20:54:45 -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=191432 00:07:30.014 20:54:45 -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:07:30.014 20:54:45 -- accel/accel_rpc.sh@15 -- # waitforlisten 191432 00:07:30.014 20:54:45 -- common/autotest_common.sh@817 -- # '[' -z 191432 ']' 00:07:30.014 20:54:45 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:30.014 20:54:45 -- common/autotest_common.sh@822 -- # local max_retries=100 00:07:30.014 20:54:45 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:30.014 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:30.014 20:54:45 -- common/autotest_common.sh@826 -- # xtrace_disable 00:07:30.014 20:54:45 -- common/autotest_common.sh@10 -- # set +x 00:07:30.273 [2024-04-25 20:54:45.692107] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:07:30.273 [2024-04-25 20:54:45.692166] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid191432 ] 00:07:30.273 EAL: No free 2048 kB hugepages reported on node 1 00:07:30.273 [2024-04-25 20:54:45.728484] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:30.273 [2024-04-25 20:54:45.759876] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:30.273 [2024-04-25 20:54:45.796068] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:30.273 20:54:45 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:07:30.273 20:54:45 -- common/autotest_common.sh@850 -- # return 0 00:07:30.273 20:54:45 -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:07:30.273 20:54:45 -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:07:30.273 20:54:45 -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:07:30.273 20:54:45 -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:07:30.273 20:54:45 -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:07:30.273 20:54:45 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:30.273 20:54:45 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:30.273 20:54:45 -- common/autotest_common.sh@10 -- # set +x 00:07:30.534 ************************************ 00:07:30.534 START TEST accel_assign_opcode 00:07:30.534 ************************************ 00:07:30.534 20:54:45 -- common/autotest_common.sh@1111 -- # accel_assign_opcode_test_suite 00:07:30.534 20:54:45 -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:07:30.534 20:54:45 -- common/autotest_common.sh@549 -- # xtrace_disable 00:07:30.534 20:54:45 -- common/autotest_common.sh@10 -- # set +x 00:07:30.534 [2024-04-25 20:54:46.000908] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:07:30.534 20:54:46 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:07:30.534 20:54:46 -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:07:30.534 20:54:46 -- 
common/autotest_common.sh@549 -- # xtrace_disable 00:07:30.534 20:54:46 -- common/autotest_common.sh@10 -- # set +x 00:07:30.534 [2024-04-25 20:54:46.008920] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:07:30.534 20:54:46 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:07:30.534 20:54:46 -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:07:30.534 20:54:46 -- common/autotest_common.sh@549 -- # xtrace_disable 00:07:30.534 20:54:46 -- common/autotest_common.sh@10 -- # set +x 00:07:30.534 20:54:46 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:07:30.534 20:54:46 -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:07:30.534 20:54:46 -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:07:30.534 20:54:46 -- common/autotest_common.sh@549 -- # xtrace_disable 00:07:30.534 20:54:46 -- common/autotest_common.sh@10 -- # set +x 00:07:30.534 20:54:46 -- accel/accel_rpc.sh@42 -- # grep software 00:07:30.534 20:54:46 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:07:30.793 software 00:07:30.793 00:07:30.793 real 0m0.224s 00:07:30.793 user 0m0.049s 00:07:30.793 sys 0m0.011s 00:07:30.793 20:54:46 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:30.793 20:54:46 -- common/autotest_common.sh@10 -- # set +x 00:07:30.793 ************************************ 00:07:30.793 END TEST accel_assign_opcode 00:07:30.793 ************************************ 00:07:30.793 20:54:46 -- accel/accel_rpc.sh@55 -- # killprocess 191432 00:07:30.793 20:54:46 -- common/autotest_common.sh@936 -- # '[' -z 191432 ']' 00:07:30.793 20:54:46 -- common/autotest_common.sh@940 -- # kill -0 191432 00:07:30.793 20:54:46 -- common/autotest_common.sh@941 -- # uname 00:07:30.793 20:54:46 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:07:30.793 20:54:46 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 191432 00:07:30.793 20:54:46 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:07:30.793 20:54:46 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:07:30.793 20:54:46 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 191432' 00:07:30.793 killing process with pid 191432 00:07:30.793 20:54:46 -- common/autotest_common.sh@955 -- # kill 191432 00:07:30.793 20:54:46 -- common/autotest_common.sh@960 -- # wait 191432 00:07:31.052 00:07:31.052 real 0m1.045s 00:07:31.052 user 0m0.999s 00:07:31.052 sys 0m0.498s 00:07:31.052 20:54:46 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:31.052 20:54:46 -- common/autotest_common.sh@10 -- # set +x 00:07:31.052 ************************************ 00:07:31.052 END TEST accel_rpc 00:07:31.052 ************************************ 00:07:31.052 20:54:46 -- spdk/autotest.sh@181 -- # run_test app_cmdline /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:31.052 20:54:46 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:31.052 20:54:46 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:31.052 20:54:46 -- common/autotest_common.sh@10 -- # set +x 00:07:31.311 ************************************ 00:07:31.311 START TEST app_cmdline 00:07:31.311 ************************************ 00:07:31.311 20:54:46 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:31.311 * Looking for test storage... 
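The accel_assign_opcode pass above is the whole RPC flow end to end: start spdk_tgt with --wait-for-rpc so the framework stays down, pin the copy opcode to a module before init, then confirm the assignment afterwards. The same sequence by hand, using scripts/rpc.py as rpc_cmd does (a sketch of the calls traced above):

    SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    "$SPDK/build/bin/spdk_tgt" --wait-for-rpc &
    # (the harness waits for /var/tmp/spdk.sock via waitforlisten here)
    "$SPDK/scripts/rpc.py" accel_assign_opc -o copy -m incorrect  # bogus module, accepted pre-init
    "$SPDK/scripts/rpc.py" accel_assign_opc -o copy -m software   # real assignment
    "$SPDK/scripts/rpc.py" framework_start_init
    "$SPDK/scripts/rpc.py" accel_get_opc_assignments | jq -r .copy  # -> software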
00:07:31.311 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:31.311 20:54:46 -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:07:31.311 20:54:46 -- app/cmdline.sh@17 -- # spdk_tgt_pid=191759 00:07:31.311 20:54:46 -- app/cmdline.sh@18 -- # waitforlisten 191759 00:07:31.311 20:54:46 -- app/cmdline.sh@16 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:07:31.311 20:54:46 -- common/autotest_common.sh@817 -- # '[' -z 191759 ']' 00:07:31.311 20:54:46 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:31.311 20:54:46 -- common/autotest_common.sh@822 -- # local max_retries=100 00:07:31.311 20:54:46 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:31.311 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:31.311 20:54:46 -- common/autotest_common.sh@826 -- # xtrace_disable 00:07:31.311 20:54:46 -- common/autotest_common.sh@10 -- # set +x 00:07:31.311 [2024-04-25 20:54:46.909621] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:07:31.311 [2024-04-25 20:54:46.909710] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid191759 ] 00:07:31.311 EAL: No free 2048 kB hugepages reported on node 1 00:07:31.311 [2024-04-25 20:54:46.946569] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 
00:07:31.570 [2024-04-25 20:54:46.978884] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:31.570 [2024-04-25 20:54:47.015778] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:31.570 20:54:47 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:07:31.570 20:54:47 -- common/autotest_common.sh@850 -- # return 0 00:07:31.570 20:54:47 -- app/cmdline.sh@20 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:07:31.829 { 00:07:31.829 "version": "SPDK v24.05-pre git sha1 06472fb6d", 00:07:31.829 "fields": { 00:07:31.829 "major": 24, 00:07:31.829 "minor": 5, 00:07:31.829 "patch": 0, 00:07:31.829 "suffix": "-pre", 00:07:31.829 "commit": "06472fb6d" 00:07:31.829 } 00:07:31.829 } 00:07:31.829 20:54:47 -- app/cmdline.sh@22 -- # expected_methods=() 00:07:31.829 20:54:47 -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:07:31.829 20:54:47 -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:07:31.829 20:54:47 -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:07:31.829 20:54:47 -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:07:31.829 20:54:47 -- app/cmdline.sh@26 -- # jq -r '.[]' 00:07:31.829 20:54:47 -- common/autotest_common.sh@549 -- # xtrace_disable 00:07:31.829 20:54:47 -- common/autotest_common.sh@10 -- # set +x 00:07:31.829 20:54:47 -- app/cmdline.sh@26 -- # sort 00:07:31.829 20:54:47 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:07:31.829 20:54:47 -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:07:31.829 20:54:47 -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:07:31.829 20:54:47 -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:31.829 20:54:47 -- common/autotest_common.sh@638 -- # local es=0 00:07:31.829 20:54:47 -- common/autotest_common.sh@640 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:31.829 20:54:47 -- common/autotest_common.sh@626 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:31.829 20:54:47 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:07:31.829 20:54:47 -- common/autotest_common.sh@630 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:31.829 20:54:47 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:07:31.829 20:54:47 -- common/autotest_common.sh@632 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:31.829 20:54:47 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:07:31.829 20:54:47 -- common/autotest_common.sh@632 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:31.829 20:54:47 -- common/autotest_common.sh@632 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py ]] 00:07:31.829 20:54:47 -- common/autotest_common.sh@641 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:32.088 request: 00:07:32.088 { 00:07:32.088 "method": "env_dpdk_get_mem_stats", 00:07:32.088 "req_id": 1 00:07:32.088 } 00:07:32.088 Got JSON-RPC error response 00:07:32.088 response: 00:07:32.088 { 00:07:32.088 "code": -32601, 00:07:32.088 "message": "Method not found" 00:07:32.088 } 00:07:32.088 20:54:47 -- common/autotest_common.sh@641 -- # es=1 
00:07:32.088 20:54:47 -- common/autotest_common.sh@649 -- # (( es > 128 )) 00:07:32.088 20:54:47 -- common/autotest_common.sh@660 -- # [[ -n '' ]] 00:07:32.088 20:54:47 -- common/autotest_common.sh@665 -- # (( !es == 0 )) 00:07:32.088 20:54:47 -- app/cmdline.sh@1 -- # killprocess 191759 00:07:32.088 20:54:47 -- common/autotest_common.sh@936 -- # '[' -z 191759 ']' 00:07:32.088 20:54:47 -- common/autotest_common.sh@940 -- # kill -0 191759 00:07:32.088 20:54:47 -- common/autotest_common.sh@941 -- # uname 00:07:32.088 20:54:47 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:07:32.088 20:54:47 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 191759 00:07:32.088 20:54:47 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:07:32.088 20:54:47 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:07:32.088 20:54:47 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 191759' 00:07:32.088 killing process with pid 191759 00:07:32.088 20:54:47 -- common/autotest_common.sh@955 -- # kill 191759 00:07:32.088 20:54:47 -- common/autotest_common.sh@960 -- # wait 191759 00:07:32.348 00:07:32.348 real 0m1.159s 00:07:32.348 user 0m1.299s 00:07:32.348 sys 0m0.466s 00:07:32.348 20:54:47 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:32.348 20:54:47 -- common/autotest_common.sh@10 -- # set +x 00:07:32.348 ************************************ 00:07:32.348 END TEST app_cmdline 00:07:32.348 ************************************ 00:07:32.348 20:54:47 -- spdk/autotest.sh@182 -- # run_test version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:32.348 20:54:47 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:32.348 20:54:47 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:32.348 20:54:47 -- common/autotest_common.sh@10 -- # set +x 00:07:32.607 ************************************ 00:07:32.607 START TEST version 00:07:32.607 ************************************ 00:07:32.607 20:54:48 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:32.607 * Looking for test storage... 
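The app_cmdline pass works the allow-list the other way around: spdk_tgt is started with --rpcs-allowed spdk_get_version,rpc_get_methods, so exactly those two methods answer and everything else must come back as JSON-RPC error -32601. That is why the env_dpdk_get_mem_stats call above is wrapped in NOT: the "Method not found" response is the passing outcome. By hand (a sketch):

    "$SPDK/build/bin/spdk_tgt" --rpcs-allowed spdk_get_version,rpc_get_methods &
    "$SPDK/scripts/rpc.py" spdk_get_version | jq -r .version
    # -> "SPDK v24.05-pre git sha1 06472fb6d"
    "$SPDK/scripts/rpc.py" rpc_get_methods | jq -r '.[]' | sort
    # -> rpc_get_methods, spdk_get_version  (the two allowed methods, nothing else)
    "$SPDK/scripts/rpc.py" env_dpdk_get_mem_stats
    # -> rejected: {"code": -32601, "message": "Method not found"}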
00:07:32.607 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:32.607 20:54:48 -- app/version.sh@17 -- # get_header_version major 00:07:32.608 20:54:48 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:32.608 20:54:48 -- app/version.sh@14 -- # tr -d '"' 00:07:32.608 20:54:48 -- app/version.sh@14 -- # cut -f2 00:07:32.608 20:54:48 -- app/version.sh@17 -- # major=24 00:07:32.608 20:54:48 -- app/version.sh@18 -- # get_header_version minor 00:07:32.608 20:54:48 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:32.608 20:54:48 -- app/version.sh@14 -- # cut -f2 00:07:32.608 20:54:48 -- app/version.sh@14 -- # tr -d '"' 00:07:32.608 20:54:48 -- app/version.sh@18 -- # minor=5 00:07:32.608 20:54:48 -- app/version.sh@19 -- # get_header_version patch 00:07:32.608 20:54:48 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:32.608 20:54:48 -- app/version.sh@14 -- # cut -f2 00:07:32.608 20:54:48 -- app/version.sh@14 -- # tr -d '"' 00:07:32.867 20:54:48 -- app/version.sh@19 -- # patch=0 00:07:32.867 20:54:48 -- app/version.sh@20 -- # get_header_version suffix 00:07:32.867 20:54:48 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:32.867 20:54:48 -- app/version.sh@14 -- # cut -f2 00:07:32.867 20:54:48 -- app/version.sh@14 -- # tr -d '"' 00:07:32.867 20:54:48 -- app/version.sh@20 -- # suffix=-pre 00:07:32.867 20:54:48 -- app/version.sh@22 -- # version=24.5 00:07:32.867 20:54:48 -- app/version.sh@25 -- # (( patch != 0 )) 00:07:32.867 20:54:48 -- app/version.sh@28 -- # version=24.5rc0 00:07:32.867 20:54:48 -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:32.867 20:54:48 -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:07:32.867 20:54:48 -- app/version.sh@30 -- # py_version=24.5rc0 00:07:32.867 20:54:48 -- app/version.sh@31 -- # [[ 24.5rc0 == \2\4\.\5\r\c\0 ]] 00:07:32.867 00:07:32.867 real 0m0.196s 00:07:32.867 user 0m0.091s 00:07:32.867 sys 0m0.147s 00:07:32.867 20:54:48 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:32.867 20:54:48 -- common/autotest_common.sh@10 -- # set +x 00:07:32.867 ************************************ 00:07:32.867 END TEST version 00:07:32.867 ************************************ 00:07:32.867 20:54:48 -- spdk/autotest.sh@184 -- # '[' 0 -eq 1 ']' 00:07:32.867 20:54:48 -- spdk/autotest.sh@194 -- # uname -s 00:07:32.867 20:54:48 -- spdk/autotest.sh@194 -- # [[ Linux == Linux ]] 00:07:32.867 20:54:48 -- spdk/autotest.sh@195 -- # [[ 0 -eq 1 ]] 00:07:32.867 20:54:48 -- spdk/autotest.sh@195 -- # [[ 0 -eq 1 ]] 00:07:32.867 20:54:48 -- spdk/autotest.sh@207 -- # '[' 0 -eq 1 ']' 00:07:32.867 20:54:48 -- spdk/autotest.sh@254 -- # '[' 0 -eq 1 ']' 00:07:32.867 20:54:48 -- spdk/autotest.sh@258 -- # timing_exit lib 00:07:32.867 20:54:48 -- common/autotest_common.sh@716 -- # xtrace_disable 
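The version test above needs no running target at all: get_header_version greps the matching #define out of include/spdk/version.h, takes the second tab-separated field, and strips the quotes, and app/version.sh then checks the assembled string against Python's spdk.__version__. A condensed sketch of that helper:

    SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    get_header_version() {  # e.g. get_header_version MAJOR -> 24
      grep -E "^#define SPDK_VERSION_${1}[[:space:]]+" \
          "$SPDK/include/spdk/version.h" | cut -f2 | tr -d '"'
    }
    major=$(get_header_version MAJOR)    # 24
    minor=$(get_header_version MINOR)    # 5
    patch=$(get_header_version PATCH)    # 0 (dropped from the version string)
    suffix=$(get_header_version SUFFIX)  # -pre, surfaced as 24.5rc0 in the check above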
00:07:32.867 20:54:48 -- common/autotest_common.sh@10 -- # set +x 00:07:32.867 20:54:48 -- spdk/autotest.sh@260 -- # '[' 0 -eq 1 ']' 00:07:32.867 20:54:48 -- spdk/autotest.sh@268 -- # '[' 0 -eq 1 ']' 00:07:32.867 20:54:48 -- spdk/autotest.sh@277 -- # '[' 0 -eq 1 ']' 00:07:32.867 20:54:48 -- spdk/autotest.sh@306 -- # '[' 0 -eq 1 ']' 00:07:32.867 20:54:48 -- spdk/autotest.sh@310 -- # '[' 0 -eq 1 ']' 00:07:32.867 20:54:48 -- spdk/autotest.sh@314 -- # '[' 0 -eq 1 ']' 00:07:32.867 20:54:48 -- spdk/autotest.sh@319 -- # '[' 0 -eq 1 ']' 00:07:32.867 20:54:48 -- spdk/autotest.sh@328 -- # '[' 0 -eq 1 ']' 00:07:32.867 20:54:48 -- spdk/autotest.sh@333 -- # '[' 0 -eq 1 ']' 00:07:32.867 20:54:48 -- spdk/autotest.sh@337 -- # '[' 0 -eq 1 ']' 00:07:32.867 20:54:48 -- spdk/autotest.sh@341 -- # '[' 0 -eq 1 ']' 00:07:32.867 20:54:48 -- spdk/autotest.sh@345 -- # '[' 0 -eq 1 ']' 00:07:32.867 20:54:48 -- spdk/autotest.sh@350 -- # '[' 0 -eq 1 ']' 00:07:32.867 20:54:48 -- spdk/autotest.sh@354 -- # '[' 0 -eq 1 ']' 00:07:32.867 20:54:48 -- spdk/autotest.sh@361 -- # [[ 0 -eq 1 ]] 00:07:32.867 20:54:48 -- spdk/autotest.sh@365 -- # [[ 0 -eq 1 ]] 00:07:32.867 20:54:48 -- spdk/autotest.sh@369 -- # [[ 1 -eq 1 ]] 00:07:32.867 20:54:48 -- spdk/autotest.sh@370 -- # run_test llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:32.867 20:54:48 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:32.867 20:54:48 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:32.867 20:54:48 -- common/autotest_common.sh@10 -- # set +x 00:07:33.126 ************************************ 00:07:33.126 START TEST llvm_fuzz 00:07:33.126 ************************************ 00:07:33.126 20:54:48 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:33.126 * Looking for test storage... 
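Every suite in this log, llvm_fuzz included, is dispatched through the same run_test wrapper, which is what produces the START/END banner blocks and the real/user/sys timing triplets. Roughly, as inferred from those banners (not copied from autotest_common.sh):

    run_test() {
      local test_name=$1; shift
      echo "************ START TEST $test_name ************"
      time "$@"       # emits the real/user/sys lines recorded after each test
      local rc=$?     # with `time cmd`, $? is still cmd's exit status
      echo "************ END TEST $test_name ************"
      return $rc
    }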
00:07:33.126 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz 00:07:33.126 20:54:48 -- fuzz/llvm.sh@11 -- # fuzzers=($(get_fuzzer_targets)) 00:07:33.126 20:54:48 -- fuzz/llvm.sh@11 -- # get_fuzzer_targets 00:07:33.126 20:54:48 -- common/autotest_common.sh@536 -- # fuzzers=() 00:07:33.126 20:54:48 -- common/autotest_common.sh@536 -- # local fuzzers 00:07:33.126 20:54:48 -- common/autotest_common.sh@538 -- # [[ -n '' ]] 00:07:33.126 20:54:48 -- common/autotest_common.sh@541 -- # fuzzers=("$rootdir/test/fuzz/llvm/"*) 00:07:33.126 20:54:48 -- common/autotest_common.sh@542 -- # fuzzers=("${fuzzers[@]##*/}") 00:07:33.126 20:54:48 -- common/autotest_common.sh@545 -- # echo 'common.sh llvm-gcov.sh nvmf vfio' 00:07:33.126 20:54:48 -- fuzz/llvm.sh@13 -- # llvm_out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:07:33.126 20:54:48 -- fuzz/llvm.sh@15 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/coverage 00:07:33.126 20:54:48 -- fuzz/llvm.sh@56 -- # [[ 1 -eq 0 ]] 00:07:33.126 20:54:48 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:07:33.126 20:54:48 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:07:33.126 20:54:48 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:07:33.126 20:54:48 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:07:33.126 20:54:48 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:07:33.126 20:54:48 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:07:33.126 20:54:48 -- fuzz/llvm.sh@62 -- # run_test nvmf_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:33.126 20:54:48 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:33.126 20:54:48 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:33.126 20:54:48 -- common/autotest_common.sh@10 -- # set +x 00:07:33.388 ************************************ 00:07:33.388 START TEST nvmf_fuzz 00:07:33.388 ************************************ 00:07:33.388 20:54:48 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:33.388 * Looking for test storage... 
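The fuzzer list for llvm.sh comes straight from the directory layout, as the get_fuzzer_targets trace above shows: glob test/fuzz/llvm/*, reduce to basenames (common.sh llvm-gcov.sh nvmf vfio), then dispatch only the per-target run.sh scripts. Condensed from that trace, with the case arms inferred:

    rootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    fuzzers=("$rootdir/test/fuzz/llvm/"*)   # common.sh llvm-gcov.sh nvmf vfio
    fuzzers=("${fuzzers[@]##*/}")           # strip the directories, keep basenames
    for fuzzer in "${fuzzers[@]}"; do
      case "$fuzzer" in
        nvmf | vfio) run_test "${fuzzer}_fuzz" "$rootdir/test/fuzz/llvm/$fuzzer/run.sh" ;;
        *) ;;   # common.sh and llvm-gcov.sh are helpers, not fuzz targets
      esac
    done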
00:07:33.388 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:33.388 20:54:48 -- nvmf/run.sh@60 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:07:33.388 20:54:48 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:07:33.388 20:54:48 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:07:33.388 20:54:48 -- common/autotest_common.sh@34 -- # set -e 00:07:33.388 20:54:48 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:07:33.388 20:54:48 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:07:33.388 20:54:48 -- common/autotest_common.sh@38 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:07:33.388 20:54:48 -- common/autotest_common.sh@43 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:07:33.388 20:54:48 -- common/autotest_common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:07:33.388 20:54:48 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:07:33.388 20:54:48 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:07:33.388 20:54:48 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:07:33.388 20:54:48 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:07:33.388 20:54:48 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:07:33.388 20:54:48 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:07:33.388 20:54:48 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:07:33.388 20:54:48 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:07:33.388 20:54:48 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:07:33.388 20:54:48 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:07:33.388 20:54:48 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:07:33.388 20:54:48 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:07:33.388 20:54:48 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:07:33.388 20:54:48 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:07:33.388 20:54:48 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:07:33.388 20:54:48 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:07:33.388 20:54:48 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:07:33.388 20:54:48 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:07:33.388 20:54:48 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:33.388 20:54:48 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:07:33.388 20:54:48 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:07:33.388 20:54:48 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:07:33.388 20:54:48 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:07:33.388 20:54:48 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:07:33.388 20:54:48 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:07:33.388 20:54:48 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:07:33.388 20:54:48 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:07:33.388 20:54:48 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:07:33.388 20:54:48 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:07:33.388 20:54:48 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:07:33.388 20:54:48 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:07:33.388 20:54:48 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 
00:07:33.388 20:54:48 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:07:33.388 20:54:48 -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB=/usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:07:33.388 20:54:48 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:07:33.388 20:54:48 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:33.388 20:54:48 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:07:33.388 20:54:48 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:07:33.388 20:54:48 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:07:33.388 20:54:48 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:07:33.388 20:54:48 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:07:33.388 20:54:48 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:07:33.388 20:54:48 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:07:33.388 20:54:48 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:07:33.388 20:54:48 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:07:33.388 20:54:48 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:07:33.388 20:54:48 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:07:33.388 20:54:48 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:07:33.388 20:54:48 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:07:33.388 20:54:48 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:07:33.388 20:54:48 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:07:33.388 20:54:48 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:07:33.388 20:54:48 -- common/build_config.sh@53 -- # CONFIG_HAVE_EVP_MAC=y 00:07:33.388 20:54:48 -- common/build_config.sh@54 -- # CONFIG_URING_ZNS=n 00:07:33.388 20:54:48 -- common/build_config.sh@55 -- # CONFIG_WERROR=y 00:07:33.388 20:54:48 -- common/build_config.sh@56 -- # CONFIG_HAVE_LIBBSD=n 00:07:33.388 20:54:48 -- common/build_config.sh@57 -- # CONFIG_UBSAN=y 00:07:33.388 20:54:48 -- common/build_config.sh@58 -- # CONFIG_IPSEC_MB_DIR= 00:07:33.388 20:54:48 -- common/build_config.sh@59 -- # CONFIG_GOLANG=n 00:07:33.388 20:54:48 -- common/build_config.sh@60 -- # CONFIG_ISAL=y 00:07:33.388 20:54:48 -- common/build_config.sh@61 -- # CONFIG_IDXD_KERNEL=n 00:07:33.388 20:54:48 -- common/build_config.sh@62 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:33.388 20:54:48 -- common/build_config.sh@63 -- # CONFIG_RDMA_PROV=verbs 00:07:33.388 20:54:48 -- common/build_config.sh@64 -- # CONFIG_APPS=y 00:07:33.388 20:54:48 -- common/build_config.sh@65 -- # CONFIG_SHARED=n 00:07:33.388 20:54:48 -- common/build_config.sh@66 -- # CONFIG_HAVE_KEYUTILS=n 00:07:33.388 20:54:48 -- common/build_config.sh@67 -- # CONFIG_FC_PATH= 00:07:33.388 20:54:48 -- common/build_config.sh@68 -- # CONFIG_DPDK_PKG_CONFIG=n 00:07:33.388 20:54:48 -- common/build_config.sh@69 -- # CONFIG_FC=n 00:07:33.388 20:54:48 -- common/build_config.sh@70 -- # CONFIG_AVAHI=n 00:07:33.388 20:54:48 -- common/build_config.sh@71 -- # CONFIG_FIO_PLUGIN=y 00:07:33.388 20:54:48 -- common/build_config.sh@72 -- # CONFIG_RAID5F=n 00:07:33.388 20:54:48 -- common/build_config.sh@73 -- # CONFIG_EXAMPLES=y 00:07:33.388 20:54:48 -- common/build_config.sh@74 -- # CONFIG_TESTS=y 00:07:33.388 20:54:48 -- common/build_config.sh@75 -- # CONFIG_CRYPTO_MLX5=n 00:07:33.388 20:54:48 -- common/build_config.sh@76 -- # CONFIG_MAX_LCORES= 00:07:33.388 20:54:48 -- 
common/build_config.sh@77 -- # CONFIG_IPSEC_MB=n 00:07:33.388 20:54:48 -- common/build_config.sh@78 -- # CONFIG_PGO_DIR= 00:07:33.388 20:54:48 -- common/build_config.sh@79 -- # CONFIG_DEBUG=y 00:07:33.388 20:54:48 -- common/build_config.sh@80 -- # CONFIG_DPDK_COMPRESSDEV=n 00:07:33.388 20:54:48 -- common/build_config.sh@81 -- # CONFIG_CROSS_PREFIX= 00:07:33.388 20:54:48 -- common/build_config.sh@82 -- # CONFIG_URING=n 00:07:33.388 20:54:48 -- common/autotest_common.sh@53 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:33.388 20:54:48 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:33.388 20:54:48 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:33.388 20:54:48 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:33.388 20:54:48 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:33.388 20:54:48 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:33.388 20:54:48 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:33.389 20:54:48 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:33.389 20:54:48 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:07:33.389 20:54:48 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:07:33.389 20:54:48 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:07:33.389 20:54:48 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:07:33.389 20:54:48 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:07:33.389 20:54:48 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:07:33.389 20:54:48 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:07:33.389 20:54:48 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:07:33.389 #define SPDK_CONFIG_H 00:07:33.389 #define SPDK_CONFIG_APPS 1 00:07:33.389 #define SPDK_CONFIG_ARCH native 00:07:33.389 #undef SPDK_CONFIG_ASAN 00:07:33.389 #undef SPDK_CONFIG_AVAHI 00:07:33.389 #undef SPDK_CONFIG_CET 00:07:33.389 #define SPDK_CONFIG_COVERAGE 1 00:07:33.389 #define SPDK_CONFIG_CROSS_PREFIX 00:07:33.389 #undef SPDK_CONFIG_CRYPTO 00:07:33.389 #undef SPDK_CONFIG_CRYPTO_MLX5 00:07:33.389 #undef SPDK_CONFIG_CUSTOMOCF 00:07:33.389 #undef SPDK_CONFIG_DAOS 00:07:33.389 #define SPDK_CONFIG_DAOS_DIR 00:07:33.389 #define SPDK_CONFIG_DEBUG 1 00:07:33.389 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:07:33.389 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:33.389 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:07:33.389 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:33.389 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:07:33.389 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:33.389 #define SPDK_CONFIG_EXAMPLES 1 00:07:33.389 #undef SPDK_CONFIG_FC 00:07:33.389 #define SPDK_CONFIG_FC_PATH 00:07:33.389 #define SPDK_CONFIG_FIO_PLUGIN 1 00:07:33.389 #define SPDK_CONFIG_FIO_SOURCE_DIR 
/usr/src/fio 00:07:33.389 #undef SPDK_CONFIG_FUSE 00:07:33.389 #define SPDK_CONFIG_FUZZER 1 00:07:33.389 #define SPDK_CONFIG_FUZZER_LIB /usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:07:33.389 #undef SPDK_CONFIG_GOLANG 00:07:33.389 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:07:33.389 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:07:33.389 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:07:33.389 #undef SPDK_CONFIG_HAVE_KEYUTILS 00:07:33.389 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:07:33.389 #undef SPDK_CONFIG_HAVE_LIBBSD 00:07:33.389 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:07:33.389 #define SPDK_CONFIG_IDXD 1 00:07:33.389 #undef SPDK_CONFIG_IDXD_KERNEL 00:07:33.389 #undef SPDK_CONFIG_IPSEC_MB 00:07:33.389 #define SPDK_CONFIG_IPSEC_MB_DIR 00:07:33.389 #define SPDK_CONFIG_ISAL 1 00:07:33.389 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:07:33.389 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:07:33.389 #define SPDK_CONFIG_LIBDIR 00:07:33.389 #undef SPDK_CONFIG_LTO 00:07:33.389 #define SPDK_CONFIG_MAX_LCORES 00:07:33.389 #define SPDK_CONFIG_NVME_CUSE 1 00:07:33.389 #undef SPDK_CONFIG_OCF 00:07:33.389 #define SPDK_CONFIG_OCF_PATH 00:07:33.389 #define SPDK_CONFIG_OPENSSL_PATH 00:07:33.389 #undef SPDK_CONFIG_PGO_CAPTURE 00:07:33.389 #define SPDK_CONFIG_PGO_DIR 00:07:33.389 #undef SPDK_CONFIG_PGO_USE 00:07:33.389 #define SPDK_CONFIG_PREFIX /usr/local 00:07:33.389 #undef SPDK_CONFIG_RAID5F 00:07:33.389 #undef SPDK_CONFIG_RBD 00:07:33.389 #define SPDK_CONFIG_RDMA 1 00:07:33.389 #define SPDK_CONFIG_RDMA_PROV verbs 00:07:33.389 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:07:33.389 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:07:33.389 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:07:33.389 #undef SPDK_CONFIG_SHARED 00:07:33.389 #undef SPDK_CONFIG_SMA 00:07:33.389 #define SPDK_CONFIG_TESTS 1 00:07:33.389 #undef SPDK_CONFIG_TSAN 00:07:33.389 #define SPDK_CONFIG_UBLK 1 00:07:33.389 #define SPDK_CONFIG_UBSAN 1 00:07:33.389 #undef SPDK_CONFIG_UNIT_TESTS 00:07:33.389 #undef SPDK_CONFIG_URING 00:07:33.389 #define SPDK_CONFIG_URING_PATH 00:07:33.389 #undef SPDK_CONFIG_URING_ZNS 00:07:33.389 #undef SPDK_CONFIG_USDT 00:07:33.389 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:07:33.389 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:07:33.389 #define SPDK_CONFIG_VFIO_USER 1 00:07:33.389 #define SPDK_CONFIG_VFIO_USER_DIR 00:07:33.389 #define SPDK_CONFIG_VHOST 1 00:07:33.389 #define SPDK_CONFIG_VIRTIO 1 00:07:33.389 #undef SPDK_CONFIG_VTUNE 00:07:33.389 #define SPDK_CONFIG_VTUNE_DIR 00:07:33.389 #define SPDK_CONFIG_WERROR 1 00:07:33.389 #define SPDK_CONFIG_WPDK_DIR 00:07:33.389 #undef SPDK_CONFIG_XNVME 00:07:33.389 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:07:33.389 20:54:48 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:07:33.389 20:54:48 -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:07:33.389 20:54:48 -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:33.389 20:54:48 -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:33.389 20:54:48 -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:33.389 20:54:48 -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:33.389 20:54:48 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:33.389 20:54:48 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:33.389 20:54:48 -- paths/export.sh@5 -- # export PATH 00:07:33.389 20:54:48 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:33.389 20:54:48 -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:33.389 20:54:48 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:33.389 20:54:48 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:33.389 20:54:49 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:33.389 20:54:49 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:07:33.389 20:54:49 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:33.389 20:54:49 -- pm/common@67 -- # TEST_TAG=N/A 00:07:33.389 20:54:49 -- pm/common@68 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:07:33.389 20:54:49 -- pm/common@70 -- # PM_OUTPUTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:07:33.389 20:54:49 -- pm/common@71 -- # uname -s 00:07:33.389 20:54:49 -- pm/common@71 -- # PM_OS=Linux 00:07:33.389 20:54:49 -- pm/common@73 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:07:33.389 20:54:49 -- pm/common@74 -- # [[ Linux == FreeBSD ]] 00:07:33.389 20:54:49 -- pm/common@76 -- # [[ Linux == Linux ]] 00:07:33.389 20:54:49 -- pm/common@76 -- # [[ ............................... 
!= QEMU ]] 00:07:33.389 20:54:49 -- pm/common@76 -- # [[ ! -e /.dockerenv ]] 00:07:33.389 20:54:49 -- pm/common@79 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:07:33.389 20:54:49 -- pm/common@80 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:07:33.389 20:54:49 -- pm/common@83 -- # MONITOR_RESOURCES_PIDS=() 00:07:33.389 20:54:49 -- pm/common@83 -- # declare -A MONITOR_RESOURCES_PIDS 00:07:33.389 20:54:49 -- pm/common@85 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:07:33.389 20:54:49 -- common/autotest_common.sh@57 -- # : 1 00:07:33.389 20:54:49 -- common/autotest_common.sh@58 -- # export RUN_NIGHTLY 00:07:33.389 20:54:49 -- common/autotest_common.sh@61 -- # : 0 00:07:33.389 20:54:49 -- common/autotest_common.sh@62 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:07:33.389 20:54:49 -- common/autotest_common.sh@63 -- # : 0 00:07:33.389 20:54:49 -- common/autotest_common.sh@64 -- # export SPDK_RUN_VALGRIND 00:07:33.389 20:54:49 -- common/autotest_common.sh@65 -- # : 1 00:07:33.389 20:54:49 -- common/autotest_common.sh@66 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:07:33.389 20:54:49 -- common/autotest_common.sh@67 -- # : 0 00:07:33.389 20:54:49 -- common/autotest_common.sh@68 -- # export SPDK_TEST_UNITTEST 00:07:33.389 20:54:49 -- common/autotest_common.sh@69 -- # : 00:07:33.389 20:54:49 -- common/autotest_common.sh@70 -- # export SPDK_TEST_AUTOBUILD 00:07:33.389 20:54:49 -- common/autotest_common.sh@71 -- # : 0 00:07:33.389 20:54:49 -- common/autotest_common.sh@72 -- # export SPDK_TEST_RELEASE_BUILD 00:07:33.389 20:54:49 -- common/autotest_common.sh@73 -- # : 0 00:07:33.389 20:54:49 -- common/autotest_common.sh@74 -- # export SPDK_TEST_ISAL 00:07:33.389 20:54:49 -- common/autotest_common.sh@75 -- # : 0 00:07:33.389 20:54:49 -- common/autotest_common.sh@76 -- # export SPDK_TEST_ISCSI 00:07:33.389 20:54:49 -- common/autotest_common.sh@77 -- # : 0 00:07:33.389 20:54:49 -- common/autotest_common.sh@78 -- # export SPDK_TEST_ISCSI_INITIATOR 00:07:33.389 20:54:49 -- common/autotest_common.sh@79 -- # : 0 00:07:33.389 20:54:49 -- common/autotest_common.sh@80 -- # export SPDK_TEST_NVME 00:07:33.389 20:54:49 -- common/autotest_common.sh@81 -- # : 0 00:07:33.389 20:54:49 -- common/autotest_common.sh@82 -- # export SPDK_TEST_NVME_PMR 00:07:33.389 20:54:49 -- common/autotest_common.sh@83 -- # : 0 00:07:33.390 20:54:49 -- common/autotest_common.sh@84 -- # export SPDK_TEST_NVME_BP 00:07:33.390 20:54:49 -- common/autotest_common.sh@85 -- # : 0 00:07:33.390 20:54:49 -- common/autotest_common.sh@86 -- # export SPDK_TEST_NVME_CLI 00:07:33.390 20:54:49 -- common/autotest_common.sh@87 -- # : 0 00:07:33.390 20:54:49 -- common/autotest_common.sh@88 -- # export SPDK_TEST_NVME_CUSE 00:07:33.390 20:54:49 -- common/autotest_common.sh@89 -- # : 0 00:07:33.390 20:54:49 -- common/autotest_common.sh@90 -- # export SPDK_TEST_NVME_FDP 00:07:33.390 20:54:49 -- common/autotest_common.sh@91 -- # : 0 00:07:33.390 20:54:49 -- common/autotest_common.sh@92 -- # export SPDK_TEST_NVMF 00:07:33.390 20:54:49 -- common/autotest_common.sh@93 -- # : 0 00:07:33.390 20:54:49 -- common/autotest_common.sh@94 -- # export SPDK_TEST_VFIOUSER 00:07:33.390 20:54:49 -- common/autotest_common.sh@95 -- # : 0 00:07:33.390 20:54:49 -- common/autotest_common.sh@96 -- # export SPDK_TEST_VFIOUSER_QEMU 00:07:33.390 20:54:49 -- common/autotest_common.sh@97 -- # : 1 00:07:33.390 20:54:49 -- common/autotest_common.sh@98 -- # export SPDK_TEST_FUZZER 00:07:33.390 20:54:49 -- common/autotest_common.sh@99 -- # : 1 00:07:33.390 20:54:49 -- 
common/autotest_common.sh@100 -- # export SPDK_TEST_FUZZER_SHORT 00:07:33.390 20:54:49 -- common/autotest_common.sh@101 -- # : rdma 00:07:33.390 20:54:49 -- common/autotest_common.sh@102 -- # export SPDK_TEST_NVMF_TRANSPORT 00:07:33.390 20:54:49 -- common/autotest_common.sh@103 -- # : 0 00:07:33.390 20:54:49 -- common/autotest_common.sh@104 -- # export SPDK_TEST_RBD 00:07:33.390 20:54:49 -- common/autotest_common.sh@105 -- # : 0 00:07:33.390 20:54:49 -- common/autotest_common.sh@106 -- # export SPDK_TEST_VHOST 00:07:33.390 20:54:49 -- common/autotest_common.sh@107 -- # : 0 00:07:33.390 20:54:49 -- common/autotest_common.sh@108 -- # export SPDK_TEST_BLOCKDEV 00:07:33.390 20:54:49 -- common/autotest_common.sh@109 -- # : 0 00:07:33.390 20:54:49 -- common/autotest_common.sh@110 -- # export SPDK_TEST_IOAT 00:07:33.390 20:54:49 -- common/autotest_common.sh@111 -- # : 0 00:07:33.390 20:54:49 -- common/autotest_common.sh@112 -- # export SPDK_TEST_BLOBFS 00:07:33.390 20:54:49 -- common/autotest_common.sh@113 -- # : 0 00:07:33.390 20:54:49 -- common/autotest_common.sh@114 -- # export SPDK_TEST_VHOST_INIT 00:07:33.390 20:54:49 -- common/autotest_common.sh@115 -- # : 0 00:07:33.390 20:54:49 -- common/autotest_common.sh@116 -- # export SPDK_TEST_LVOL 00:07:33.390 20:54:49 -- common/autotest_common.sh@117 -- # : 0 00:07:33.390 20:54:49 -- common/autotest_common.sh@118 -- # export SPDK_TEST_VBDEV_COMPRESS 00:07:33.390 20:54:49 -- common/autotest_common.sh@119 -- # : 0 00:07:33.390 20:54:49 -- common/autotest_common.sh@120 -- # export SPDK_RUN_ASAN 00:07:33.390 20:54:49 -- common/autotest_common.sh@121 -- # : 1 00:07:33.390 20:54:49 -- common/autotest_common.sh@122 -- # export SPDK_RUN_UBSAN 00:07:33.390 20:54:49 -- common/autotest_common.sh@123 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:33.390 20:54:49 -- common/autotest_common.sh@124 -- # export SPDK_RUN_EXTERNAL_DPDK 00:07:33.390 20:54:49 -- common/autotest_common.sh@125 -- # : 0 00:07:33.390 20:54:49 -- common/autotest_common.sh@126 -- # export SPDK_RUN_NON_ROOT 00:07:33.390 20:54:49 -- common/autotest_common.sh@127 -- # : 0 00:07:33.390 20:54:49 -- common/autotest_common.sh@128 -- # export SPDK_TEST_CRYPTO 00:07:33.390 20:54:49 -- common/autotest_common.sh@129 -- # : 0 00:07:33.390 20:54:49 -- common/autotest_common.sh@130 -- # export SPDK_TEST_FTL 00:07:33.390 20:54:49 -- common/autotest_common.sh@131 -- # : 0 00:07:33.390 20:54:49 -- common/autotest_common.sh@132 -- # export SPDK_TEST_OCF 00:07:33.390 20:54:49 -- common/autotest_common.sh@133 -- # : 0 00:07:33.390 20:54:49 -- common/autotest_common.sh@134 -- # export SPDK_TEST_VMD 00:07:33.390 20:54:49 -- common/autotest_common.sh@135 -- # : 0 00:07:33.390 20:54:49 -- common/autotest_common.sh@136 -- # export SPDK_TEST_OPAL 00:07:33.390 20:54:49 -- common/autotest_common.sh@137 -- # : main 00:07:33.390 20:54:49 -- common/autotest_common.sh@138 -- # export SPDK_TEST_NATIVE_DPDK 00:07:33.390 20:54:49 -- common/autotest_common.sh@139 -- # : true 00:07:33.390 20:54:49 -- common/autotest_common.sh@140 -- # export SPDK_AUTOTEST_X 00:07:33.390 20:54:49 -- common/autotest_common.sh@141 -- # : 0 00:07:33.390 20:54:49 -- common/autotest_common.sh@142 -- # export SPDK_TEST_RAID5 00:07:33.390 20:54:49 -- common/autotest_common.sh@143 -- # : 0 00:07:33.390 20:54:49 -- common/autotest_common.sh@144 -- # export SPDK_TEST_URING 00:07:33.390 20:54:49 -- common/autotest_common.sh@145 -- # : 0 00:07:33.390 20:54:49 -- common/autotest_common.sh@146 -- # export SPDK_TEST_USDT 00:07:33.390 
20:54:49 -- common/autotest_common.sh@147 -- # : 0 00:07:33.390 20:54:49 -- common/autotest_common.sh@148 -- # export SPDK_TEST_USE_IGB_UIO 00:07:33.390 20:54:49 -- common/autotest_common.sh@149 -- # : 0 00:07:33.390 20:54:49 -- common/autotest_common.sh@150 -- # export SPDK_TEST_SCHEDULER 00:07:33.390 20:54:49 -- common/autotest_common.sh@151 -- # : 0 00:07:33.390 20:54:49 -- common/autotest_common.sh@152 -- # export SPDK_TEST_SCANBUILD 00:07:33.390 20:54:49 -- common/autotest_common.sh@153 -- # : 00:07:33.390 20:54:49 -- common/autotest_common.sh@154 -- # export SPDK_TEST_NVMF_NICS 00:07:33.390 20:54:49 -- common/autotest_common.sh@155 -- # : 0 00:07:33.390 20:54:49 -- common/autotest_common.sh@156 -- # export SPDK_TEST_SMA 00:07:33.390 20:54:49 -- common/autotest_common.sh@157 -- # : 0 00:07:33.390 20:54:49 -- common/autotest_common.sh@158 -- # export SPDK_TEST_DAOS 00:07:33.390 20:54:49 -- common/autotest_common.sh@159 -- # : 0 00:07:33.390 20:54:49 -- common/autotest_common.sh@160 -- # export SPDK_TEST_XNVME 00:07:33.390 20:54:49 -- common/autotest_common.sh@161 -- # : 0 00:07:33.390 20:54:49 -- common/autotest_common.sh@162 -- # export SPDK_TEST_ACCEL_DSA 00:07:33.390 20:54:49 -- common/autotest_common.sh@163 -- # : 0 00:07:33.390 20:54:49 -- common/autotest_common.sh@164 -- # export SPDK_TEST_ACCEL_IAA 00:07:33.390 20:54:49 -- common/autotest_common.sh@166 -- # : 00:07:33.390 20:54:49 -- common/autotest_common.sh@167 -- # export SPDK_TEST_FUZZER_TARGET 00:07:33.390 20:54:49 -- common/autotest_common.sh@168 -- # : 0 00:07:33.390 20:54:49 -- common/autotest_common.sh@169 -- # export SPDK_TEST_NVMF_MDNS 00:07:33.390 20:54:49 -- common/autotest_common.sh@170 -- # : 0 00:07:33.390 20:54:49 -- common/autotest_common.sh@171 -- # export SPDK_JSONRPC_GO_CLIENT 00:07:33.390 20:54:49 -- common/autotest_common.sh@174 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:33.390 20:54:49 -- common/autotest_common.sh@174 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:33.390 20:54:49 -- common/autotest_common.sh@175 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:33.390 20:54:49 -- common/autotest_common.sh@175 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:33.390 20:54:49 -- common/autotest_common.sh@176 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:33.390 20:54:49 -- common/autotest_common.sh@176 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:33.390 20:54:49 -- common/autotest_common.sh@177 -- # export 
LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:33.390 20:54:49 -- common/autotest_common.sh@177 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:33.390 20:54:49 -- common/autotest_common.sh@180 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:07:33.390 20:54:49 -- common/autotest_common.sh@180 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:07:33.390 20:54:49 -- common/autotest_common.sh@184 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:33.390 20:54:49 -- common/autotest_common.sh@184 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:33.390 20:54:49 -- common/autotest_common.sh@188 -- # export PYTHONDONTWRITEBYTECODE=1 00:07:33.390 20:54:49 -- common/autotest_common.sh@188 -- 
# PYTHONDONTWRITEBYTECODE=1 00:07:33.390 20:54:49 -- common/autotest_common.sh@192 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:33.390 20:54:49 -- common/autotest_common.sh@192 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:33.390 20:54:49 -- common/autotest_common.sh@193 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:33.390 20:54:49 -- common/autotest_common.sh@193 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:33.390 20:54:49 -- common/autotest_common.sh@197 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:07:33.390 20:54:49 -- common/autotest_common.sh@198 -- # rm -rf /var/tmp/asan_suppression_file 00:07:33.390 20:54:49 -- common/autotest_common.sh@199 -- # cat 00:07:33.390 20:54:49 -- common/autotest_common.sh@225 -- # echo leak:libfuse3.so 00:07:33.390 20:54:49 -- common/autotest_common.sh@227 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:33.390 20:54:49 -- common/autotest_common.sh@227 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:33.390 20:54:49 -- common/autotest_common.sh@229 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:33.390 20:54:49 -- common/autotest_common.sh@229 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:33.390 20:54:49 -- common/autotest_common.sh@231 -- # '[' -z /var/spdk/dependencies ']' 00:07:33.391 20:54:49 -- common/autotest_common.sh@234 -- # export DEPENDENCY_DIR 00:07:33.391 20:54:49 -- common/autotest_common.sh@238 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:33.391 20:54:49 -- common/autotest_common.sh@238 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:33.391 20:54:49 -- common/autotest_common.sh@239 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:33.391 20:54:49 -- common/autotest_common.sh@239 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:33.391 20:54:49 -- common/autotest_common.sh@242 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:33.391 20:54:49 -- common/autotest_common.sh@242 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:33.391 20:54:49 -- common/autotest_common.sh@243 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:33.391 20:54:49 -- common/autotest_common.sh@243 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:33.391 20:54:49 -- common/autotest_common.sh@245 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:33.391 20:54:49 -- common/autotest_common.sh@245 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:33.391 20:54:49 -- common/autotest_common.sh@248 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:33.391 20:54:49 -- common/autotest_common.sh@248 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:33.391 20:54:49 -- common/autotest_common.sh@251 -- # '[' 0 -eq 0 ']' 00:07:33.391 20:54:49 -- common/autotest_common.sh@252 -- # export valgrind= 00:07:33.391 20:54:49 -- common/autotest_common.sh@252 -- # valgrind= 00:07:33.391 20:54:49 -- common/autotest_common.sh@258 -- # uname -s 00:07:33.651 20:54:49 -- 
common/autotest_common.sh@258 -- # '[' Linux = Linux ']' 00:07:33.651 20:54:49 -- common/autotest_common.sh@259 -- # HUGEMEM=4096 00:07:33.651 20:54:49 -- common/autotest_common.sh@260 -- # export CLEAR_HUGE=yes 00:07:33.651 20:54:49 -- common/autotest_common.sh@260 -- # CLEAR_HUGE=yes 00:07:33.651 20:54:49 -- common/autotest_common.sh@261 -- # [[ 0 -eq 1 ]] 00:07:33.651 20:54:49 -- common/autotest_common.sh@261 -- # [[ 0 -eq 1 ]] 00:07:33.651 20:54:49 -- common/autotest_common.sh@268 -- # MAKE=make 00:07:33.651 20:54:49 -- common/autotest_common.sh@269 -- # MAKEFLAGS=-j112 00:07:33.651 20:54:49 -- common/autotest_common.sh@285 -- # export HUGEMEM=4096 00:07:33.651 20:54:49 -- common/autotest_common.sh@285 -- # HUGEMEM=4096 00:07:33.651 20:54:49 -- common/autotest_common.sh@287 -- # NO_HUGE=() 00:07:33.651 20:54:49 -- common/autotest_common.sh@288 -- # TEST_MODE= 00:07:33.651 20:54:49 -- common/autotest_common.sh@307 -- # [[ -z 192226 ]] 00:07:33.651 20:54:49 -- common/autotest_common.sh@307 -- # kill -0 192226 00:07:33.651 20:54:49 -- common/autotest_common.sh@1666 -- # set_test_storage 2147483648 00:07:33.651 20:54:49 -- common/autotest_common.sh@317 -- # [[ -v testdir ]] 00:07:33.651 20:54:49 -- common/autotest_common.sh@319 -- # local requested_size=2147483648 00:07:33.651 20:54:49 -- common/autotest_common.sh@320 -- # local mount target_dir 00:07:33.651 20:54:49 -- common/autotest_common.sh@322 -- # local -A mounts fss sizes avails uses 00:07:33.651 20:54:49 -- common/autotest_common.sh@323 -- # local source fs size avail mount use 00:07:33.651 20:54:49 -- common/autotest_common.sh@325 -- # local storage_fallback storage_candidates 00:07:33.651 20:54:49 -- common/autotest_common.sh@327 -- # mktemp -udt spdk.XXXXXX 00:07:33.651 20:54:49 -- common/autotest_common.sh@327 -- # storage_fallback=/tmp/spdk.2RnLpk 00:07:33.651 20:54:49 -- common/autotest_common.sh@332 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:07:33.651 20:54:49 -- common/autotest_common.sh@334 -- # [[ -n '' ]] 00:07:33.651 20:54:49 -- common/autotest_common.sh@339 -- # [[ -n '' ]] 00:07:33.651 20:54:49 -- common/autotest_common.sh@344 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf /tmp/spdk.2RnLpk/tests/nvmf /tmp/spdk.2RnLpk 00:07:33.651 20:54:49 -- common/autotest_common.sh@347 -- # requested_size=2214592512 00:07:33.652 20:54:49 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:07:33.652 20:54:49 -- common/autotest_common.sh@316 -- # df -T 00:07:33.652 20:54:49 -- common/autotest_common.sh@316 -- # grep -v Filesystem 00:07:33.652 20:54:49 -- common/autotest_common.sh@350 -- # mounts["$mount"]=spdk_devtmpfs 00:07:33.652 20:54:49 -- common/autotest_common.sh@350 -- # fss["$mount"]=devtmpfs 00:07:33.652 20:54:49 -- common/autotest_common.sh@351 -- # avails["$mount"]=67108864 00:07:33.652 20:54:49 -- common/autotest_common.sh@351 -- # sizes["$mount"]=67108864 00:07:33.652 20:54:49 -- common/autotest_common.sh@352 -- # uses["$mount"]=0 00:07:33.652 20:54:49 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:07:33.652 20:54:49 -- common/autotest_common.sh@350 -- # mounts["$mount"]=/dev/pmem0 00:07:33.652 20:54:49 -- common/autotest_common.sh@350 -- # fss["$mount"]=ext2 00:07:33.652 20:54:49 -- common/autotest_common.sh@351 -- # avails["$mount"]=1052192768 00:07:33.652 20:54:49 -- common/autotest_common.sh@351 -- # sizes["$mount"]=5284429824 00:07:33.652 20:54:49 -- 
common/autotest_common.sh@352 -- # uses["$mount"]=4232237056 00:07:33.652 20:54:49 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:07:33.652 20:54:49 -- common/autotest_common.sh@350 -- # mounts["$mount"]=spdk_root 00:07:33.652 20:54:49 -- common/autotest_common.sh@350 -- # fss["$mount"]=overlay 00:07:33.652 20:54:49 -- common/autotest_common.sh@351 -- # avails["$mount"]=52741152768 00:07:33.652 20:54:49 -- common/autotest_common.sh@351 -- # sizes["$mount"]=61742305280 00:07:33.652 20:54:49 -- common/autotest_common.sh@352 -- # uses["$mount"]=9001152512 00:07:33.652 20:54:49 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:07:33.652 20:54:49 -- common/autotest_common.sh@350 -- # mounts["$mount"]=tmpfs 00:07:33.652 20:54:49 -- common/autotest_common.sh@350 -- # fss["$mount"]=tmpfs 00:07:33.652 20:54:49 -- common/autotest_common.sh@351 -- # avails["$mount"]=30868537344 00:07:33.652 20:54:49 -- common/autotest_common.sh@351 -- # sizes["$mount"]=30871150592 00:07:33.652 20:54:49 -- common/autotest_common.sh@352 -- # uses["$mount"]=2613248 00:07:33.652 20:54:49 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:07:33.652 20:54:49 -- common/autotest_common.sh@350 -- # mounts["$mount"]=tmpfs 00:07:33.652 20:54:49 -- common/autotest_common.sh@350 -- # fss["$mount"]=tmpfs 00:07:33.652 20:54:49 -- common/autotest_common.sh@351 -- # avails["$mount"]=12342480896 00:07:33.652 20:54:49 -- common/autotest_common.sh@351 -- # sizes["$mount"]=12348461056 00:07:33.652 20:54:49 -- common/autotest_common.sh@352 -- # uses["$mount"]=5980160 00:07:33.652 20:54:49 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:07:33.652 20:54:49 -- common/autotest_common.sh@350 -- # mounts["$mount"]=tmpfs 00:07:33.652 20:54:49 -- common/autotest_common.sh@350 -- # fss["$mount"]=tmpfs 00:07:33.652 20:54:49 -- common/autotest_common.sh@351 -- # avails["$mount"]=30870736896 00:07:33.652 20:54:49 -- common/autotest_common.sh@351 -- # sizes["$mount"]=30871154688 00:07:33.652 20:54:49 -- common/autotest_common.sh@352 -- # uses["$mount"]=417792 00:07:33.652 20:54:49 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:07:33.652 20:54:49 -- common/autotest_common.sh@350 -- # mounts["$mount"]=tmpfs 00:07:33.652 20:54:49 -- common/autotest_common.sh@350 -- # fss["$mount"]=tmpfs 00:07:33.652 20:54:49 -- common/autotest_common.sh@351 -- # avails["$mount"]=6174224384 00:07:33.652 20:54:49 -- common/autotest_common.sh@351 -- # sizes["$mount"]=6174228480 00:07:33.652 20:54:49 -- common/autotest_common.sh@352 -- # uses["$mount"]=4096 00:07:33.652 20:54:49 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:07:33.652 20:54:49 -- common/autotest_common.sh@355 -- # printf '* Looking for test storage...\n' 00:07:33.652 * Looking for test storage... 
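The trace above is the test-storage search from autotest_common.sh as captured by xtrace: parse `df -T` into associative arrays keyed by mount point, then walk candidate directories until one sits on a filesystem with enough free space. A minimal sketch of the same logic, assuming the variable names shown in the trace (the overlay/tmpfs special cases and the final new_size check are abbreviated here):

    set_test_storage() {
        local requested_size=$1 target_space mount target_dir
        local -A mounts fss sizes avails uses
        local storage_fallback storage_candidates

        # Candidate dirs, as in the trace: the test dir itself, then a mktemp fallback.
        storage_fallback=$(mktemp -udt spdk.XXXXXX)
        storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback")

        # Parse `df -T` into arrays keyed by mount point (header dropped by grep).
        while read -r source fs size use avail _ mount; do
            mounts["$mount"]=$source
            fss["$mount"]=$fs
            sizes["$mount"]=$size
            avails["$mount"]=$avail
            uses["$mount"]=$use
        done < <(df -T | grep -v Filesystem)

        # Take the first candidate whose filesystem has enough free space.
        for target_dir in "${storage_candidates[@]}"; do
            mount=$(df "$target_dir" | awk '$1 !~ /Filesystem/{print $6}')
            target_space=${avails["$mount"]}
            ((target_space >= requested_size)) || continue
            export SPDK_TEST_STORAGE=$target_dir
            printf '* Found test storage at %s\n' "$target_dir"
            return 0
        done
        return 1
    }

In this run the root overlay mount won with ~52 GB available against a ~2.2 GB request, which is the "/ == /" branch visible in the trace.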
00:07:33.652 20:54:49 -- common/autotest_common.sh@357 -- # local target_space new_size 00:07:33.652 20:54:49 -- common/autotest_common.sh@358 -- # for target_dir in "${storage_candidates[@]}" 00:07:33.652 20:54:49 -- common/autotest_common.sh@361 -- # awk '$1 !~ /Filesystem/{print $6}' 00:07:33.652 20:54:49 -- common/autotest_common.sh@361 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:33.652 20:54:49 -- common/autotest_common.sh@361 -- # mount=/ 00:07:33.652 20:54:49 -- common/autotest_common.sh@363 -- # target_space=52741152768 00:07:33.652 20:54:49 -- common/autotest_common.sh@364 -- # (( target_space == 0 || target_space < requested_size )) 00:07:33.652 20:54:49 -- common/autotest_common.sh@367 -- # (( target_space >= requested_size )) 00:07:33.652 20:54:49 -- common/autotest_common.sh@369 -- # [[ overlay == tmpfs ]] 00:07:33.652 20:54:49 -- common/autotest_common.sh@369 -- # [[ overlay == ramfs ]] 00:07:33.652 20:54:49 -- common/autotest_common.sh@369 -- # [[ / == / ]] 00:07:33.652 20:54:49 -- common/autotest_common.sh@370 -- # new_size=11215745024 00:07:33.652 20:54:49 -- common/autotest_common.sh@371 -- # (( new_size * 100 / sizes[/] > 95 )) 00:07:33.652 20:54:49 -- common/autotest_common.sh@376 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:33.652 20:54:49 -- common/autotest_common.sh@376 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:33.652 20:54:49 -- common/autotest_common.sh@377 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:33.652 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:33.652 20:54:49 -- common/autotest_common.sh@378 -- # return 0 00:07:33.652 20:54:49 -- common/autotest_common.sh@1668 -- # set -o errtrace 00:07:33.652 20:54:49 -- common/autotest_common.sh@1669 -- # shopt -s extdebug 00:07:33.652 20:54:49 -- common/autotest_common.sh@1670 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:07:33.652 20:54:49 -- common/autotest_common.sh@1672 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:07:33.652 20:54:49 -- common/autotest_common.sh@1673 -- # true 00:07:33.652 20:54:49 -- common/autotest_common.sh@1675 -- # xtrace_fd 00:07:33.652 20:54:49 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:07:33.652 20:54:49 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:07:33.652 20:54:49 -- common/autotest_common.sh@27 -- # exec 00:07:33.652 20:54:49 -- common/autotest_common.sh@29 -- # exec 00:07:33.652 20:54:49 -- common/autotest_common.sh@31 -- # xtrace_restore 00:07:33.652 20:54:49 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:07:33.652 20:54:49 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:07:33.652 20:54:49 -- common/autotest_common.sh@18 -- # set -x 00:07:33.652 20:54:49 -- nvmf/run.sh@61 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/../common.sh 00:07:33.652 20:54:49 -- ../common.sh@8 -- # pids=() 00:07:33.652 20:54:49 -- nvmf/run.sh@63 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:33.652 20:54:49 -- nvmf/run.sh@64 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:33.652 20:54:49 -- nvmf/run.sh@64 -- # fuzz_num=25 00:07:33.652 20:54:49 -- nvmf/run.sh@65 -- # (( fuzz_num != 0 )) 00:07:33.652 20:54:49 -- nvmf/run.sh@67 -- # trap 'cleanup /tmp/llvm_fuzz* /var/tmp/suppress_nvmf_fuzz; exit 1' SIGINT SIGTERM EXIT 00:07:33.652 20:54:49 -- nvmf/run.sh@69 -- # mem_size=512 00:07:33.652 20:54:49 -- nvmf/run.sh@70 -- # [[ 1 -eq 1 ]] 00:07:33.652 20:54:49 -- nvmf/run.sh@71 -- # start_llvm_fuzz_short 25 1 00:07:33.652 20:54:49 -- ../common.sh@69 -- # local fuzz_num=25 00:07:33.652 20:54:49 -- ../common.sh@70 -- # local time=1 00:07:33.652 20:54:49 -- ../common.sh@72 -- # (( i = 0 )) 00:07:33.652 20:54:49 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:33.652 20:54:49 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:07:33.652 20:54:49 -- nvmf/run.sh@23 -- # local fuzzer_type=0 00:07:33.652 20:54:49 -- nvmf/run.sh@24 -- # local timen=1 00:07:33.652 20:54:49 -- nvmf/run.sh@25 -- # local core=0x1 00:07:33.652 20:54:49 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:33.652 20:54:49 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_0.conf 00:07:33.652 20:54:49 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:33.652 20:54:49 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:33.652 20:54:49 -- nvmf/run.sh@34 -- # printf %02d 0 00:07:33.652 20:54:49 -- nvmf/run.sh@34 -- # port=4400 00:07:33.652 20:54:49 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:33.652 20:54:49 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' 00:07:33.652 20:54:49 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4400"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:33.652 20:54:49 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:33.652 20:54:49 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:33.652 20:54:49 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' -c /tmp/fuzz_json_0.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 -Z 0 00:07:33.652 [2024-04-25 20:54:49.120060] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 
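The commands traced above are nvmf/run.sh preparing and launching fuzzer 0 of 25 (fuzz_num comes from counting '\.fn =' entries in llvm_nvme_fuzz.c). A simplified sketch of that start_llvm_fuzz step, reconstructed only from the commands shown: $rootdir and $testdir stand in for the workspace paths, the two output redirections are implied rather than shown by xtrace, and the port formula is inferred from `printf %02d 0` followed by port=4400:

    start_llvm_fuzz() {
        local fuzzer_type=$1 timen=$2 core=$3
        local corpus_dir=$rootdir/../corpus/llvm_nvmf_$fuzzer_type
        local nvmf_cfg=/tmp/fuzz_json_$fuzzer_type.conf
        local suppress_file=/var/tmp/suppress_nvmf_fuzz
        # Fuzzer N appears to listen on TCP port 44NN: index 0 -> 4400.
        local port="44$(printf '%02d' "$fuzzer_type")"

        mkdir -p "$corpus_dir"
        local trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
        # Point this instance's JSON config at its own port (redirect assumed).
        sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
            "$testdir/fuzz_json.conf" > "$nvmf_cfg"
        # LSAN suppressions the harness installs, per the two echo lines in the trace.
        echo "leak:spdk_nvmf_qpair_disconnect" > "$suppress_file"
        echo "leak:nvmf_ctrlr_create" >> "$suppress_file"

        "$rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" \
            -m "$core" -s 512 -P "$rootdir/../output/llvm/" \
            -F "$trid" -c "$nvmf_cfg" -t "$timen" -D "$corpus_dir" -Z "$fuzzer_type"
    }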
00:07:33.652 [2024-04-25 20:54:49.120114] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid192277 ] 00:07:33.652 EAL: No free 2048 kB hugepages reported on node 1 00:07:33.652 [2024-04-25 20:54:49.257443] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:33.652 [2024-04-25 20:54:49.294468] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:33.912 [2024-04-25 20:54:49.313779] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:33.912 [2024-04-25 20:54:49.365893] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:33.912 [2024-04-25 20:54:49.382225] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4400 *** 00:07:33.912 INFO: Running with entropic power schedule (0xFF, 100). 00:07:33.912 INFO: Seed: 2486881451 00:07:33.912 INFO: Loaded 1 modules (348579 inline 8-bit counters): 348579 [0x2764bcc, 0x27b9d6f), 00:07:33.912 INFO: Loaded 1 PC tables (348579 PCs): 348579 [0x27b9d70,0x2d0b7a0), 00:07:33.912 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:33.912 INFO: A corpus is not provided, starting from an empty corpus 00:07:33.912 #2 INITED exec/s: 0 rss: 61Mb 00:07:33.912 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:33.912 This may also happen if the target rejected all inputs we tried so far 00:07:33.912 [2024-04-25 20:54:49.427347] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:33.912 [2024-04-25 20:54:49.427375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.171 NEW_FUNC[1/671]: 0x4a3c60 in fuzz_admin_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:47 00:07:34.171 NEW_FUNC[2/671]: 0x4e0360 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:34.171 #3 NEW cov: 11638 ft: 11644 corp: 2/102b lim: 320 exec/s: 0 rss: 68Mb L: 101/101 MS: 1 InsertRepeatedBytes- 00:07:34.171 [2024-04-25 20:54:49.728117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.171 [2024-04-25 20:54:49.728151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.171 #9 NEW cov: 11773 ft: 12149 corp: 3/203b lim: 320 exec/s: 0 rss: 68Mb L: 101/101 MS: 1 ChangeByte- 00:07:34.171 [2024-04-25 20:54:49.778141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffdfffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.171 [2024-04-25 20:54:49.778169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.171 #10 NEW cov: 11779 ft: 12354 corp: 4/304b lim: 320 exec/s: 0 rss: 68Mb L: 101/101 MS: 1 ChangeByte- 00:07:34.171 
[2024-04-25 20:54:49.818296] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.171 [2024-04-25 20:54:49.818322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.430 #11 NEW cov: 11890 ft: 12709 corp: 5/408b lim: 320 exec/s: 0 rss: 68Mb L: 104/104 MS: 1 InsertRepeatedBytes- 00:07:34.430 [2024-04-25 20:54:49.858596] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.430 [2024-04-25 20:54:49.858621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.430 [2024-04-25 20:54:49.858681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.430 [2024-04-25 20:54:49.858695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.430 [2024-04-25 20:54:49.858753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.430 [2024-04-25 20:54:49.858766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.430 #12 NEW cov: 11890 ft: 13038 corp: 6/627b lim: 320 exec/s: 0 rss: 68Mb L: 219/219 MS: 1 InsertRepeatedBytes- 00:07:34.431 [2024-04-25 20:54:49.898447] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:dfffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffff5bff 00:07:34.431 [2024-04-25 20:54:49.898471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.431 #18 NEW cov: 11890 ft: 13090 corp: 7/729b lim: 320 exec/s: 0 rss: 69Mb L: 102/219 MS: 1 InsertByte- 00:07:34.431 [2024-04-25 20:54:49.938581] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffff5bff 00:07:34.431 [2024-04-25 20:54:49.938605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.431 #20 NEW cov: 11890 ft: 13220 corp: 8/846b lim: 320 exec/s: 0 rss: 69Mb L: 117/219 MS: 2 EraseBytes-CrossOver- 00:07:34.431 [2024-04-25 20:54:49.978806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.431 [2024-04-25 20:54:49.978832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.431 [2024-04-25 20:54:49.978891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.431 [2024-04-25 20:54:49.978908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.431 #21 NEW cov: 11890 ft: 13418 corp: 9/1019b lim: 320 exec/s: 0 rss: 69Mb L: 173/219 MS: 1 CrossOver- 00:07:34.431 [2024-04-25 20:54:50.018859] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ff2effff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.431 [2024-04-25 20:54:50.018908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.431 #22 NEW cov: 11890 ft: 13435 corp: 10/1121b lim: 320 exec/s: 0 rss: 69Mb L: 102/219 MS: 1 InsertByte- 00:07:34.431 [2024-04-25 20:54:50.059060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:fdffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.431 [2024-04-25 20:54:50.059087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.431 [2024-04-25 20:54:50.059149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.431 [2024-04-25 20:54:50.059163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.431 #23 NEW cov: 11890 ft: 13503 corp: 11/1294b lim: 320 exec/s: 0 rss: 69Mb L: 173/219 MS: 1 ChangeBinInt- 00:07:34.690 [2024-04-25 20:54:50.099278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.690 [2024-04-25 20:54:50.099306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.690 [2024-04-25 20:54:50.099368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.690 [2024-04-25 20:54:50.099382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.690 [2024-04-25 20:54:50.099441] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.690 [2024-04-25 20:54:50.099454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.690 #24 NEW cov: 11890 ft: 13549 corp: 12/1513b lim: 320 exec/s: 0 rss: 69Mb L: 219/219 MS: 1 ChangeBit- 00:07:34.690 [2024-04-25 20:54:50.139287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:fdffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.690 [2024-04-25 20:54:50.139313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.690 [2024-04-25 20:54:50.139373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffff3bff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.690 [2024-04-25 20:54:50.139388] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.690 #25 NEW cov: 11890 ft: 13575 corp: 13/1687b lim: 320 exec/s: 0 rss: 69Mb L: 174/219 MS: 1 InsertByte- 00:07:34.690 [2024-04-25 20:54:50.179301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.690 [2024-04-25 20:54:50.179326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.690 #27 NEW cov: 11890 ft: 13597 corp: 14/1814b lim: 320 exec/s: 0 rss: 69Mb L: 127/219 MS: 2 EraseBytes-InsertRepeatedBytes- 00:07:34.690 [2024-04-25 20:54:50.219616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.690 [2024-04-25 20:54:50.219642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.691 [2024-04-25 20:54:50.219704] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.691 [2024-04-25 20:54:50.219718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.691 [2024-04-25 20:54:50.219782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ea72d53a cdw11:ff0076fc SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.691 [2024-04-25 20:54:50.219796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.691 #28 NEW cov: 11890 ft: 13618 corp: 15/2041b lim: 320 exec/s: 0 rss: 70Mb L: 227/227 MS: 1 CMP- DE: "q:\325r\352\374v\000"- 00:07:34.691 [2024-04-25 20:54:50.259593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffdfffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.691 [2024-04-25 20:54:50.259619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.691 #34 NEW cov: 11890 ft: 13683 corp: 16/2143b lim: 320 exec/s: 0 rss: 70Mb L: 102/227 MS: 1 InsertByte- 00:07:34.691 [2024-04-25 20:54:50.299631] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffdfffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.691 [2024-04-25 20:54:50.299656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.691 NEW_FUNC[1/1]: 0x19e6b30 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:34.691 #35 NEW cov: 11913 ft: 13692 corp: 17/2244b lim: 320 exec/s: 0 rss: 70Mb L: 101/227 MS: 1 ChangeByte- 00:07:34.691 [2024-04-25 20:54:50.339816] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.691 [2024-04-25 20:54:50.339841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.691 [2024-04-25 20:54:50.339902] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.691 [2024-04-25 20:54:50.339916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.950 #36 NEW cov: 11913 ft: 13709 corp: 18/2417b lim: 320 exec/s: 0 rss: 70Mb L: 173/227 MS: 1 ShuffleBytes- 00:07:34.950 [2024-04-25 20:54:50.379997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.950 [2024-04-25 20:54:50.380023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.950 [2024-04-25 20:54:50.380084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.950 [2024-04-25 20:54:50.380098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.950 [2024-04-25 20:54:50.380155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.950 [2024-04-25 20:54:50.380171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.950 #37 NEW cov: 11913 ft: 13751 corp: 19/2637b lim: 320 exec/s: 0 rss: 70Mb L: 220/227 MS: 1 InsertByte- 00:07:34.950 [2024-04-25 20:54:50.420115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.950 [2024-04-25 20:54:50.420140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.950 [2024-04-25 20:54:50.420202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.950 [2024-04-25 20:54:50.420215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.950 [2024-04-25 20:54:50.420277] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.951 [2024-04-25 20:54:50.420291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.951 #38 NEW cov: 11913 ft: 13793 corp: 20/2857b lim: 320 exec/s: 38 rss: 70Mb L: 220/227 MS: 1 CopyPart- 00:07:34.951 [2024-04-25 20:54:50.460244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.951 [2024-04-25 20:54:50.460269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:07:34.951 [2024-04-25 20:54:50.460331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.951 [2024-04-25 20:54:50.460345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.951 [2024-04-25 20:54:50.460408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.951 [2024-04-25 20:54:50.460422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.951 #39 NEW cov: 11913 ft: 13814 corp: 21/3077b lim: 320 exec/s: 39 rss: 70Mb L: 220/227 MS: 1 ChangeBit- 00:07:34.951 [2024-04-25 20:54:50.500322] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.951 [2024-04-25 20:54:50.500347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.951 [2024-04-25 20:54:50.500413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.951 [2024-04-25 20:54:50.500427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.951 [2024-04-25 20:54:50.500487] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.951 [2024-04-25 20:54:50.500502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.951 #40 NEW cov: 11913 ft: 13826 corp: 22/3298b lim: 320 exec/s: 40 rss: 70Mb L: 221/227 MS: 1 InsertByte- 00:07:34.951 [2024-04-25 20:54:50.540498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.951 [2024-04-25 20:54:50.540526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.951 [2024-04-25 20:54:50.540587] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.951 [2024-04-25 20:54:50.540602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.951 [2024-04-25 20:54:50.540663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.951 [2024-04-25 20:54:50.540677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.951 #41 NEW cov: 11913 ft: 13873 corp: 23/3518b lim: 320 exec/s: 41 rss: 70Mb L: 220/227 MS: 1 ChangeByte- 00:07:34.951 
[2024-04-25 20:54:50.580458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.951 [2024-04-25 20:54:50.580483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.951 [2024-04-25 20:54:50.580544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.951 [2024-04-25 20:54:50.580559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.951 #42 NEW cov: 11913 ft: 13881 corp: 24/3696b lim: 320 exec/s: 42 rss: 70Mb L: 178/227 MS: 1 EraseBytes- 00:07:35.211 [2024-04-25 20:54:50.620710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:35.211 [2024-04-25 20:54:50.620735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.211 [2024-04-25 20:54:50.620796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:35.211 [2024-04-25 20:54:50.620810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.211 [2024-04-25 20:54:50.620873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:35.211 [2024-04-25 20:54:50.620887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.211 #43 NEW cov: 11913 ft: 13891 corp: 25/3919b lim: 320 exec/s: 43 rss: 70Mb L: 223/227 MS: 1 InsertRepeatedBytes- 00:07:35.211 [2024-04-25 20:54:50.660628] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:35.211 [2024-04-25 20:54:50.660653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.211 #44 NEW cov: 11913 ft: 13906 corp: 26/4020b lim: 320 exec/s: 44 rss: 70Mb L: 101/227 MS: 1 CrossOver- 00:07:35.211 [2024-04-25 20:54:50.700947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:35.211 [2024-04-25 20:54:50.700971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.211 [2024-04-25 20:54:50.701040] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:35.211 [2024-04-25 20:54:50.701058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.212 [2024-04-25 20:54:50.701122] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:35.212 [2024-04-25 20:54:50.701135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.212 #45 NEW cov: 11913 ft: 13909 corp: 27/4241b lim: 320 exec/s: 45 rss: 70Mb L: 221/227 MS: 1 ChangeByte- 00:07:35.212 [2024-04-25 20:54:50.740942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:35.212 [2024-04-25 20:54:50.740968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.212 [2024-04-25 20:54:50.741041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:35.212 [2024-04-25 20:54:50.741071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.212 #46 NEW cov: 11913 ft: 13917 corp: 28/4414b lim: 320 exec/s: 46 rss: 70Mb L: 173/227 MS: 1 ChangeBinInt- 00:07:35.212 [2024-04-25 20:54:50.781164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:35.212 [2024-04-25 20:54:50.781189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.212 [2024-04-25 20:54:50.781250] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:35.212 [2024-04-25 20:54:50.781264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.212 [2024-04-25 20:54:50.781322] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:35.212 [2024-04-25 20:54:50.781336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.212 #52 NEW cov: 11913 ft: 13925 corp: 29/4633b lim: 320 exec/s: 52 rss: 70Mb L: 219/227 MS: 1 ShuffleBytes- 00:07:35.212 [2024-04-25 20:54:50.821293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ff3dffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffff00 00:07:35.212 [2024-04-25 20:54:50.821318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.212 [2024-04-25 20:54:50.821379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:35.212 [2024-04-25 20:54:50.821394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.212 [2024-04-25 20:54:50.821455] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffdfffffffffff 00:07:35.212 [2024-04-25 20:54:50.821469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.212 #53 NEW cov: 11913 ft: 13927 corp: 30/4861b lim: 320 exec/s: 53 rss: 70Mb L: 228/228 MS: 1 PersAutoDict- DE: "q:\325r\352\374v\000"- 00:07:35.212 [2024-04-25 20:54:50.861306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:fdffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:35.212 [2024-04-25 20:54:50.861331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.212 [2024-04-25 20:54:50.861391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:35.212 [2024-04-25 20:54:50.861405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.471 #54 NEW cov: 11913 ft: 13935 corp: 31/5021b lim: 320 exec/s: 54 rss: 70Mb L: 160/228 MS: 1 EraseBytes- 00:07:35.471 [2024-04-25 20:54:50.901320] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffdfffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:35.471 [2024-04-25 20:54:50.901344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.471 #55 NEW cov: 11913 ft: 13970 corp: 32/5122b lim: 320 exec/s: 55 rss: 70Mb L: 101/228 MS: 1 ShuffleBytes- 00:07:35.471 [2024-04-25 20:54:50.931539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:35.471 [2024-04-25 20:54:50.931564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.471 [2024-04-25 20:54:50.931625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:35.471 [2024-04-25 20:54:50.931639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.471 [2024-04-25 20:54:50.931700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:35.471 [2024-04-25 20:54:50.931714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.471 #56 NEW cov: 11913 ft: 13973 corp: 33/5343b lim: 320 exec/s: 56 rss: 70Mb L: 221/228 MS: 1 CopyPart- 00:07:35.472 [2024-04-25 20:54:50.971685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:35.472 [2024-04-25 20:54:50.971710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.472 [2024-04-25 20:54:50.971772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:35.472 [2024-04-25 20:54:50.971786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.472 [2024-04-25 20:54:50.971844] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ea72d53a cdw11:ff0076fc SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:35.472 [2024-04-25 20:54:50.971857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.472 #57 NEW cov: 11913 ft: 13989 corp: 34/5571b lim: 320 exec/s: 57 rss: 70Mb L: 228/228 MS: 1 InsertByte- 00:07:35.472 [2024-04-25 20:54:51.011828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:35.472 [2024-04-25 20:54:51.011853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.472 [2024-04-25 20:54:51.011916] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:35.472 [2024-04-25 20:54:51.011930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.472 [2024-04-25 20:54:51.011988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:35.472 [2024-04-25 20:54:51.012007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.472 #58 NEW cov: 11913 ft: 14021 corp: 35/5785b lim: 320 exec/s: 58 rss: 70Mb L: 214/228 MS: 1 EraseBytes- 00:07:35.472 [2024-04-25 20:54:51.051749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:d53a71ff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:35.472 [2024-04-25 20:54:51.051774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.472 #59 NEW cov: 11913 ft: 14040 corp: 36/5910b lim: 320 exec/s: 59 rss: 70Mb L: 125/228 MS: 1 PersAutoDict- DE: "q:\325r\352\374v\000"- 00:07:35.472 [2024-04-25 20:54:51.091999] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:fdffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:35.472 [2024-04-25 20:54:51.092024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.472 [2024-04-25 20:54:51.092083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:35.472 [2024-04-25 20:54:51.092099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 
cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.472 #60 NEW cov: 11913 ft: 14112 corp: 37/6071b lim: 320 exec/s: 60 rss: 70Mb L: 161/228 MS: 1 InsertByte- 00:07:35.472 [2024-04-25 20:54:51.132221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ff3dffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffff00 00:07:35.472 [2024-04-25 20:54:51.132247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.472 [2024-04-25 20:54:51.132308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:dfffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:35.472 [2024-04-25 20:54:51.132322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.472 [2024-04-25 20:54:51.132384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffdfffffffffff 00:07:35.472 [2024-04-25 20:54:51.132398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.732 #61 NEW cov: 11913 ft: 14125 corp: 38/6299b lim: 320 exec/s: 61 rss: 70Mb L: 228/228 MS: 1 ChangeBit- 00:07:35.732 [2024-04-25 20:54:51.172073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:35.732 [2024-04-25 20:54:51.172097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.732 #62 NEW cov: 11913 ft: 14135 corp: 39/6400b lim: 320 exec/s: 62 rss: 70Mb L: 101/228 MS: 1 ShuffleBytes- 00:07:35.732 [2024-04-25 20:54:51.212482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:35.732 [2024-04-25 20:54:51.212510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.732 [2024-04-25 20:54:51.212572] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:35.732 [2024-04-25 20:54:51.212586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.732 [2024-04-25 20:54:51.212651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:35.732 [2024-04-25 20:54:51.212664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.732 [2024-04-25 20:54:51.212724] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:7 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:35.732 [2024-04-25 20:54:51.212738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.732 
#63 NEW cov: 11913 ft: 14356 corp: 40/6667b lim: 320 exec/s: 63 rss: 70Mb L: 267/267 MS: 1 CrossOver- 00:07:35.732 [2024-04-25 20:54:51.252360] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:35.732 [2024-04-25 20:54:51.252385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.732 #64 NEW cov: 11913 ft: 14429 corp: 41/6771b lim: 320 exec/s: 64 rss: 70Mb L: 104/267 MS: 1 ChangeBinInt- 00:07:35.732 [2024-04-25 20:54:51.292649] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:35.732 [2024-04-25 20:54:51.292675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.732 [2024-04-25 20:54:51.292737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:35.732 [2024-04-25 20:54:51.292752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.732 [2024-04-25 20:54:51.292813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:0076fcea cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:35.732 [2024-04-25 20:54:51.292827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.732 #65 NEW cov: 11913 ft: 14451 corp: 42/6991b lim: 320 exec/s: 65 rss: 71Mb L: 220/267 MS: 1 PersAutoDict- DE: "q:\325r\352\374v\000"- 00:07:35.732 [2024-04-25 20:54:51.332863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:35.732 [2024-04-25 20:54:51.332890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.732 [2024-04-25 20:54:51.332948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:35.732 [2024-04-25 20:54:51.332962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.732 [2024-04-25 20:54:51.333022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ea72d53a cdw11:ff0076fc SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:35.732 [2024-04-25 20:54:51.333036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.732 [2024-04-25 20:54:51.333097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:7 nsid:ffffffff cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0xffff 00:07:35.732 [2024-04-25 20:54:51.333111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.732 #66 NEW cov: 11913 ft: 14479 corp: 43/7305b lim: 320 exec/s: 
66 rss: 71Mb L: 314/314 MS: 1 InsertRepeatedBytes- 00:07:35.732 [2024-04-25 20:54:51.372748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffdfffffff 00:07:35.732 [2024-04-25 20:54:51.372774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.992 #67 NEW cov: 11913 ft: 14504 corp: 44/7406b lim: 320 exec/s: 67 rss: 71Mb L: 101/314 MS: 1 ChangeBit- 00:07:35.992 [2024-04-25 20:54:51.413014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:35.992 [2024-04-25 20:54:51.413039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.992 [2024-04-25 20:54:51.413098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:35.992 [2024-04-25 20:54:51.413112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.992 [2024-04-25 20:54:51.413166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ea72d53a cdw11:ff0076fc SGL TRANSPORT DATA BLOCK TRANSPORT 0x2ffffff 00:07:35.992 [2024-04-25 20:54:51.413179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.992 #68 NEW cov: 11913 ft: 14515 corp: 45/7633b lim: 320 exec/s: 34 rss: 71Mb L: 227/314 MS: 1 ChangeBinInt- 00:07:35.992 #68 DONE cov: 11913 ft: 14515 corp: 45/7633b lim: 320 exec/s: 34 rss: 71Mb 00:07:35.992 ###### Recommended dictionary. ###### 00:07:35.992 "q:\325r\352\374v\000" # Uses: 3 00:07:35.992 ###### End of recommended dictionary. 
###### 00:07:35.992 Done 68 runs in 2 second(s) 00:07:35.992 20:54:51 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_0.conf /var/tmp/suppress_nvmf_fuzz 00:07:35.992 20:54:51 -- ../common.sh@72 -- # (( i++ )) 00:07:35.992 20:54:51 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:35.992 20:54:51 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:07:35.992 20:54:51 -- nvmf/run.sh@23 -- # local fuzzer_type=1 00:07:35.992 20:54:51 -- nvmf/run.sh@24 -- # local timen=1 00:07:35.992 20:54:51 -- nvmf/run.sh@25 -- # local core=0x1 00:07:35.992 20:54:51 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:35.992 20:54:51 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_1.conf 00:07:35.992 20:54:51 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:35.992 20:54:51 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:35.992 20:54:51 -- nvmf/run.sh@34 -- # printf %02d 1 00:07:35.992 20:54:51 -- nvmf/run.sh@34 -- # port=4401 00:07:35.992 20:54:51 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:35.992 20:54:51 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' 00:07:35.992 20:54:51 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4401"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:35.992 20:54:51 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:35.992 20:54:51 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:35.992 20:54:51 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' -c /tmp/fuzz_json_1.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 -Z 1 00:07:35.992 [2024-04-25 20:54:51.580519] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:07:35.992 [2024-04-25 20:54:51.580591] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid192791 ] 00:07:35.992 EAL: No free 2048 kB hugepages reported on node 1 00:07:36.252 [2024-04-25 20:54:51.721064] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:36.252 [2024-04-25 20:54:51.758413] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:36.252 [2024-04-25 20:54:51.777489] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:36.252 [2024-04-25 20:54:51.829523] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:36.252 [2024-04-25 20:54:51.845851] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4401 *** 00:07:36.252 INFO: Running with entropic power schedule (0xFF, 100). 
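[Editor's note] The xtrace above (nvmf/run.sh@23 through @45) can be collapsed into a standalone reproduction of how the harness launches this second fuzzer instance. The following is a minimal sketch reconstructed from that trace, not part of the log itself: the $SPDK variable is an assumption standing in for the Jenkins workspace checkout, and the redirection targets for the sed and echo steps are inferred from the -c argument and the LSAN_OPTIONS value that the trace sets.

    #!/usr/bin/env bash
    # Assumption: $SPDK replaces /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    SPDK=/path/to/spdk
    mkdir -p "$SPDK/../corpus/llvm_nvmf_1"
    # Retarget the NVMe/TCP listener config from the default port 4420 to 4401.
    sed -e 's/"trsvcid": "4420"/"trsvcid": "4401"/' \
        "$SPDK/test/fuzz/llvm/nvmf/fuzz_json.conf" > /tmp/fuzz_json_1.conf
    # LeakSanitizer suppressions for the two allocation sites echoed in the trace.
    { echo leak:spdk_nvmf_qpair_disconnect
      echo leak:nvmf_ctrlr_create; } > /var/tmp/suppress_nvmf_fuzz
    # Launch fuzzer type 1 on core 0x1 with a 1-second budget against the TCP trid.
    LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 \
    "$SPDK/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m 0x1 -s 512 \
        -P "$SPDK/../output/llvm/" \
        -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' \
        -c /tmp/fuzz_json_1.conf -t 1 -D "$SPDK/../corpus/llvm_nvmf_1" -Z 1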
00:07:36.252 INFO: Seed: 654896091 00:07:36.252 INFO: Loaded 1 modules (348579 inline 8-bit counters): 348579 [0x2764bcc, 0x27b9d6f), 00:07:36.252 INFO: Loaded 1 PC tables (348579 PCs): 348579 [0x27b9d70,0x2d0b7a0), 00:07:36.252 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:36.252 INFO: A corpus is not provided, starting from an empty corpus 00:07:36.252 #2 INITED exec/s: 0 rss: 61Mb 00:07:36.252 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:36.252 This may also happen if the target rejected all inputs we tried so far 00:07:36.252 [2024-04-25 20:54:51.910912] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (43052) > buf size (4096) 00:07:36.252 [2024-04-25 20:54:51.911226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2a0a00ab cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.252 [2024-04-25 20:54:51.911256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.252 [2024-04-25 20:54:51.911313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.252 [2024-04-25 20:54:51.911327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.770 NEW_FUNC[1/671]: 0x4a4560 in fuzz_admin_get_log_page_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:67 00:07:36.770 NEW_FUNC[2/671]: 0x4e0360 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:36.770 #15 NEW cov: 11743 ft: 11743 corp: 2/18b lim: 30 exec/s: 0 rss: 68Mb L: 17/17 MS: 3 InsertByte-InsertByte-InsertRepeatedBytes- 00:07:36.770 [2024-04-25 20:54:52.241902] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007b7b 00:07:36.770 [2024-04-25 20:54:52.242041] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007b7b 00:07:36.770 [2024-04-25 20:54:52.242150] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007b7b 00:07:36.770 [2024-04-25 20:54:52.242405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:8a7b837b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.770 [2024-04-25 20:54:52.242460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.771 [2024-04-25 20:54:52.242544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:7b7b837b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.771 [2024-04-25 20:54:52.242572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.771 [2024-04-25 20:54:52.242656] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:7b7b837b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.771 [2024-04-25 20:54:52.242682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.771 #19 NEW cov: 11879 ft: 12719 corp: 3/37b lim: 30 exec/s: 0 rss: 68Mb L: 
19/19 MS: 4 CopyPart-ChangeBit-ChangeBinInt-InsertRepeatedBytes- 00:07:36.771 [2024-04-25 20:54:52.291815] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200007b7b 00:07:36.771 [2024-04-25 20:54:52.291929] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007b7b 00:07:36.771 [2024-04-25 20:54:52.292040] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007b7b 00:07:36.771 [2024-04-25 20:54:52.292243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff02ff cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.771 [2024-04-25 20:54:52.292269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.771 [2024-04-25 20:54:52.292324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:7b7b837b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.771 [2024-04-25 20:54:52.292338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.771 [2024-04-25 20:54:52.292393] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:7b7b837b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.771 [2024-04-25 20:54:52.292407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.771 #25 NEW cov: 11885 ft: 12997 corp: 4/59b lim: 30 exec/s: 0 rss: 68Mb L: 22/22 MS: 1 InsertRepeatedBytes- 00:07:36.771 [2024-04-25 20:54:52.331919] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200007b7b 00:07:36.771 [2024-04-25 20:54:52.332038] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007b7b 00:07:36.771 [2024-04-25 20:54:52.332145] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007b7b 00:07:36.771 [2024-04-25 20:54:52.332351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff02ff cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.771 [2024-04-25 20:54:52.332379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.771 [2024-04-25 20:54:52.332435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:7b7b837b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.771 [2024-04-25 20:54:52.332450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.771 [2024-04-25 20:54:52.332506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:0a7b837b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.771 [2024-04-25 20:54:52.332519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.771 #36 NEW cov: 11970 ft: 13226 corp: 5/82b lim: 30 exec/s: 0 rss: 68Mb L: 23/23 MS: 1 CrossOver- 00:07:36.771 [2024-04-25 20:54:52.382125] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (43052) > buf size (4096) 00:07:36.771 [2024-04-25 20:54:52.382329] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x45 00:07:36.771 [2024-04-25 
20:54:52.382450] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100004545 00:07:36.771 [2024-04-25 20:54:52.382655] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2a0a00ab cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.771 [2024-04-25 20:54:52.382681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.771 [2024-04-25 20:54:52.382742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.771 [2024-04-25 20:54:52.382756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.771 [2024-04-25 20:54:52.382812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.771 [2024-04-25 20:54:52.382826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.771 [2024-04-25 20:54:52.382880] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:45458145 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.771 [2024-04-25 20:54:52.382894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.771 #37 NEW cov: 11970 ft: 13754 corp: 6/108b lim: 30 exec/s: 0 rss: 69Mb L: 26/26 MS: 1 InsertRepeatedBytes- 00:07:36.771 [2024-04-25 20:54:52.432312] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300000aab 00:07:36.771 [2024-04-25 20:54:52.432515] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007b7b 00:07:36.771 [2024-04-25 20:54:52.432619] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007b7b 00:07:36.771 [2024-04-25 20:54:52.432738] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007bf5 00:07:37.031 [2024-04-25 20:54:52.432946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:8a7b837b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.031 [2024-04-25 20:54:52.432974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.031 [2024-04-25 20:54:52.433034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.031 [2024-04-25 20:54:52.433049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.031 [2024-04-25 20:54:52.433104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00008300 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.031 [2024-04-25 20:54:52.433119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.031 [2024-04-25 20:54:52.433175] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:7b7b837b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.031 [2024-04-25 
20:54:52.433189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.031 [2024-04-25 20:54:52.433244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:7b7b837b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.031 [2024-04-25 20:54:52.433259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:37.031 #43 NEW cov: 11970 ft: 13888 corp: 7/138b lim: 30 exec/s: 0 rss: 69Mb L: 30/30 MS: 1 CrossOver- 00:07:37.031 [2024-04-25 20:54:52.472393] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (43052) > buf size (4096) 00:07:37.031 [2024-04-25 20:54:52.472598] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300000045 00:07:37.031 [2024-04-25 20:54:52.472713] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100004545 00:07:37.031 [2024-04-25 20:54:52.472930] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2a0a00ab cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.031 [2024-04-25 20:54:52.472957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.031 [2024-04-25 20:54:52.473020] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.031 [2024-04-25 20:54:52.473035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.031 [2024-04-25 20:54:52.473092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00008300 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.031 [2024-04-25 20:54:52.473106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.031 [2024-04-25 20:54:52.473162] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:45458145 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.031 [2024-04-25 20:54:52.473177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.031 #44 NEW cov: 11970 ft: 13972 corp: 8/164b lim: 30 exec/s: 0 rss: 69Mb L: 26/30 MS: 1 ChangeByte- 00:07:37.031 [2024-04-25 20:54:52.522447] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200007b7b 00:07:37.031 [2024-04-25 20:54:52.522557] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007b7b 00:07:37.031 [2024-04-25 20:54:52.522661] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007b7b 00:07:37.031 [2024-04-25 20:54:52.522860] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff02ff cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.031 [2024-04-25 20:54:52.522886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.031 [2024-04-25 20:54:52.522943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:7b7b837b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:07:37.031 [2024-04-25 20:54:52.522958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.031 [2024-04-25 20:54:52.523007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:7b7b837b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.031 [2024-04-25 20:54:52.523022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.031 #45 NEW cov: 11970 ft: 14067 corp: 9/186b lim: 30 exec/s: 0 rss: 69Mb L: 22/30 MS: 1 CrossOver- 00:07:37.031 [2024-04-25 20:54:52.562579] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (43052) > buf size (4096) 00:07:37.031 [2024-04-25 20:54:52.562786] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300000045 00:07:37.031 [2024-04-25 20:54:52.562894] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100004523 00:07:37.031 [2024-04-25 20:54:52.563114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2a0a00ab cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.031 [2024-04-25 20:54:52.563140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.031 [2024-04-25 20:54:52.563196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.031 [2024-04-25 20:54:52.563210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.031 [2024-04-25 20:54:52.563263] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00008300 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.031 [2024-04-25 20:54:52.563277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.031 [2024-04-25 20:54:52.563331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:45458145 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.031 [2024-04-25 20:54:52.563348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.031 #46 NEW cov: 11970 ft: 14106 corp: 10/213b lim: 30 exec/s: 0 rss: 69Mb L: 27/30 MS: 1 InsertByte- 00:07:37.031 [2024-04-25 20:54:52.602676] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200007b7b 00:07:37.031 [2024-04-25 20:54:52.602789] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007b7b 00:07:37.031 [2024-04-25 20:54:52.602899] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007b7b 00:07:37.031 [2024-04-25 20:54:52.603107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff02ff cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.031 [2024-04-25 20:54:52.603132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.031 [2024-04-25 20:54:52.603187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:7b7b837b cdw11:00000003 SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:07:37.031 [2024-04-25 20:54:52.603201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.031 [2024-04-25 20:54:52.603259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:7b7a837b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.031 [2024-04-25 20:54:52.603272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.031 #47 NEW cov: 11970 ft: 14141 corp: 11/235b lim: 30 exec/s: 0 rss: 69Mb L: 22/30 MS: 1 ChangeBit- 00:07:37.031 [2024-04-25 20:54:52.642858] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (43052) > buf size (4096) 00:07:37.032 [2024-04-25 20:54:52.643070] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300000045 00:07:37.032 [2024-04-25 20:54:52.643177] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100004523 00:07:37.032 [2024-04-25 20:54:52.643384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2a0a00ab cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.032 [2024-04-25 20:54:52.643411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.032 [2024-04-25 20:54:52.643464] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.032 [2024-04-25 20:54:52.643479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.032 [2024-04-25 20:54:52.643533] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00008300 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.032 [2024-04-25 20:54:52.643547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.032 [2024-04-25 20:54:52.643601] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:45458145 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.032 [2024-04-25 20:54:52.643615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.032 #48 NEW cov: 11970 ft: 14157 corp: 12/263b lim: 30 exec/s: 0 rss: 69Mb L: 28/30 MS: 1 InsertByte- 00:07:37.032 [2024-04-25 20:54:52.692961] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200007b7b 00:07:37.032 [2024-04-25 20:54:52.693085] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007b7b 00:07:37.291 [2024-04-25 20:54:52.693195] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007b7b 00:07:37.291 [2024-04-25 20:54:52.693406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff02ff cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.291 [2024-04-25 20:54:52.693435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.291 [2024-04-25 20:54:52.693492] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:7b7b837b 
cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.291 [2024-04-25 20:54:52.693507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.291 [2024-04-25 20:54:52.693560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:7bf5837b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.291 [2024-04-25 20:54:52.693574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.291 #49 NEW cov: 11970 ft: 14209 corp: 13/285b lim: 30 exec/s: 0 rss: 69Mb L: 22/30 MS: 1 CopyPart- 00:07:37.291 [2024-04-25 20:54:52.732998] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x4545 00:07:37.291 [2024-04-25 20:54:52.733111] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100004545 00:07:37.291 [2024-04-25 20:54:52.733317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2a0a00ab cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.291 [2024-04-25 20:54:52.733342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.291 [2024-04-25 20:54:52.733396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:45458145 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.291 [2024-04-25 20:54:52.733410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.291 #50 NEW cov: 11970 ft: 14245 corp: 14/298b lim: 30 exec/s: 0 rss: 69Mb L: 13/30 MS: 1 EraseBytes- 00:07:37.291 [2024-04-25 20:54:52.773169] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (43052) > buf size (4096) 00:07:37.291 [2024-04-25 20:54:52.773374] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300000045 00:07:37.291 [2024-04-25 20:54:52.773479] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100004523 00:07:37.291 [2024-04-25 20:54:52.773698] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2a0a00ab cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.291 [2024-04-25 20:54:52.773724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.291 [2024-04-25 20:54:52.773779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.291 [2024-04-25 20:54:52.773794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.291 [2024-04-25 20:54:52.773849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00008300 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.292 [2024-04-25 20:54:52.773862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.292 [2024-04-25 20:54:52.773919] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:45458145 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.292 [2024-04-25 20:54:52.773933] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.292 NEW_FUNC[1/1]: 0x19e6b30 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:37.292 #51 NEW cov: 11993 ft: 14320 corp: 15/326b lim: 30 exec/s: 0 rss: 69Mb L: 28/30 MS: 1 CopyPart- 00:07:37.292 [2024-04-25 20:54:52.823326] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (43052) > buf size (4096) 00:07:37.292 [2024-04-25 20:54:52.823443] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x71 00:07:37.292 [2024-04-25 20:54:52.823548] ctrlr.c:2605:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (41728) > len (4) 00:07:37.292 [2024-04-25 20:54:52.823653] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100004545 00:07:37.292 [2024-04-25 20:54:52.823875] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2a0a00ab cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.292 [2024-04-25 20:54:52.823901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.292 [2024-04-25 20:54:52.823955] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.292 [2024-04-25 20:54:52.823970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.292 [2024-04-25 20:54:52.824038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.292 [2024-04-25 20:54:52.824053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.292 [2024-04-25 20:54:52.824108] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:45458145 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.292 [2024-04-25 20:54:52.824123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.292 #52 NEW cov: 11999 ft: 14368 corp: 16/353b lim: 30 exec/s: 0 rss: 69Mb L: 27/30 MS: 1 InsertByte- 00:07:37.292 [2024-04-25 20:54:52.863399] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200007b7b 00:07:37.292 [2024-04-25 20:54:52.863515] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007b5d 00:07:37.292 [2024-04-25 20:54:52.863623] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007b7b 00:07:37.292 [2024-04-25 20:54:52.863833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff02ff cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.292 [2024-04-25 20:54:52.863859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.292 [2024-04-25 20:54:52.863915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:7b7b837b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.292 [2024-04-25 20:54:52.863930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 
dnr:0 00:07:37.292 [2024-04-25 20:54:52.863985] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:7b7b83f5 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.292 [2024-04-25 20:54:52.864004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.292 #53 NEW cov: 11999 ft: 14387 corp: 17/376b lim: 30 exec/s: 53 rss: 70Mb L: 23/30 MS: 1 InsertByte- 00:07:37.292 [2024-04-25 20:54:52.903421] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (43052) > buf size (4096) 00:07:37.292 [2024-04-25 20:54:52.903627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2a0a00ab cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.292 [2024-04-25 20:54:52.903652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.292 #54 NEW cov: 11999 ft: 14784 corp: 18/385b lim: 30 exec/s: 54 rss: 70Mb L: 9/30 MS: 1 EraseBytes- 00:07:37.292 [2024-04-25 20:54:52.943593] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200007b7b 00:07:37.292 [2024-04-25 20:54:52.943708] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007b7b 00:07:37.292 [2024-04-25 20:54:52.943818] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007b7b 00:07:37.292 [2024-04-25 20:54:52.944021] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff02ff cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.292 [2024-04-25 20:54:52.944047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.292 [2024-04-25 20:54:52.944104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:7b7b837b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.292 [2024-04-25 20:54:52.944118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.292 [2024-04-25 20:54:52.944181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:7b7a8382 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.292 [2024-04-25 20:54:52.944195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.576 #55 NEW cov: 11999 ft: 14792 corp: 19/407b lim: 30 exec/s: 55 rss: 70Mb L: 22/30 MS: 1 ChangeBinInt- 00:07:37.576 [2024-04-25 20:54:52.983643] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000a2a 00:07:37.576 [2024-04-25 20:54:52.983852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2a2a020a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.576 [2024-04-25 20:54:52.983878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.576 #57 NEW cov: 11999 ft: 14819 corp: 20/413b lim: 30 exec/s: 57 rss: 70Mb L: 6/30 MS: 2 CrossOver-CopyPart- 00:07:37.576 [2024-04-25 20:54:53.023872] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300000aab 00:07:37.576 [2024-04-25 20:54:53.024084] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page 
offset 0x300007b7b 00:07:37.576 [2024-04-25 20:54:53.024190] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007b7b 00:07:37.576 [2024-04-25 20:54:53.024290] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007bf5 00:07:37.576 [2024-04-25 20:54:53.024500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:167b837b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.576 [2024-04-25 20:54:53.024526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.576 [2024-04-25 20:54:53.024580] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.576 [2024-04-25 20:54:53.024594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.576 [2024-04-25 20:54:53.024651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00008300 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.576 [2024-04-25 20:54:53.024665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.576 [2024-04-25 20:54:53.024719] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:7b7b837b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.576 [2024-04-25 20:54:53.024732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.576 [2024-04-25 20:54:53.024788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:7b7b837b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.576 [2024-04-25 20:54:53.024802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:37.576 #58 NEW cov: 11999 ft: 14851 corp: 21/443b lim: 30 exec/s: 58 rss: 70Mb L: 30/30 MS: 1 ChangeByte- 00:07:37.576 [2024-04-25 20:54:53.073946] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300000aab 00:07:37.576 [2024-04-25 20:54:53.074067] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007b7b 00:07:37.576 [2024-04-25 20:54:53.074272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:167b837b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.576 [2024-04-25 20:54:53.074300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.576 [2024-04-25 20:54:53.074354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00008300 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.576 [2024-04-25 20:54:53.074368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.576 #59 NEW cov: 11999 ft: 14867 corp: 22/460b lim: 30 exec/s: 59 rss: 70Mb L: 17/30 MS: 1 EraseBytes- 00:07:37.576 [2024-04-25 20:54:53.114077] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300000aab 00:07:37.576 [2024-04-25 20:54:53.114189] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: 
Invalid log page offset 0x300007b7b 00:07:37.576 [2024-04-25 20:54:53.114294] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000f51e 00:07:37.576 [2024-04-25 20:54:53.114499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:167b837b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.576 [2024-04-25 20:54:53.114525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.576 [2024-04-25 20:54:53.114581] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00008300 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.576 [2024-04-25 20:54:53.114595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.576 [2024-04-25 20:54:53.114652] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:7b7b837b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.577 [2024-04-25 20:54:53.114667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.577 #60 NEW cov: 11999 ft: 14902 corp: 23/478b lim: 30 exec/s: 60 rss: 70Mb L: 18/30 MS: 1 InsertByte- 00:07:37.577 [2024-04-25 20:54:53.154177] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200007b7b 00:07:37.577 [2024-04-25 20:54:53.154296] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007b7b 00:07:37.577 [2024-04-25 20:54:53.154405] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007b7b 00:07:37.577 [2024-04-25 20:54:53.154625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff02ff cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.577 [2024-04-25 20:54:53.154651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.577 [2024-04-25 20:54:53.154705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:7b7b837b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.577 [2024-04-25 20:54:53.154720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.577 [2024-04-25 20:54:53.154774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:7b7a8392 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.577 [2024-04-25 20:54:53.154788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.577 #61 NEW cov: 11999 ft: 14917 corp: 24/500b lim: 30 exec/s: 61 rss: 70Mb L: 22/30 MS: 1 ChangeBit- 00:07:37.577 [2024-04-25 20:54:53.194312] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300000aab 00:07:37.577 [2024-04-25 20:54:53.194427] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007b7b 00:07:37.577 [2024-04-25 20:54:53.194636] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:167b837b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.577 [2024-04-25 20:54:53.194662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD 
(00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.577 [2024-04-25 20:54:53.194717] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:007b837b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.577 [2024-04-25 20:54:53.194732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.577 #62 NEW cov: 11999 ft: 14954 corp: 25/515b lim: 30 exec/s: 62 rss: 70Mb L: 15/30 MS: 1 EraseBytes- 00:07:37.577 [2024-04-25 20:54:53.234390] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000a2a 00:07:37.577 [2024-04-25 20:54:53.234601] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2a2a020a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.577 [2024-04-25 20:54:53.234635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.840 #63 NEW cov: 11999 ft: 14967 corp: 26/525b lim: 30 exec/s: 63 rss: 70Mb L: 10/30 MS: 1 CopyPart- 00:07:37.840 [2024-04-25 20:54:53.274598] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (43052) > buf size (4096) 00:07:37.840 [2024-04-25 20:54:53.274804] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300000045 00:07:37.840 [2024-04-25 20:54:53.274906] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100004523 00:07:37.840 [2024-04-25 20:54:53.275128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2a0a00ab cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.840 [2024-04-25 20:54:53.275153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.840 [2024-04-25 20:54:53.275209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.840 [2024-04-25 20:54:53.275223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.840 [2024-04-25 20:54:53.275277] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00008300 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.840 [2024-04-25 20:54:53.275291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.840 [2024-04-25 20:54:53.275343] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:45458145 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.841 [2024-04-25 20:54:53.275358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.841 #64 NEW cov: 11999 ft: 15008 corp: 27/553b lim: 30 exec/s: 64 rss: 70Mb L: 28/30 MS: 1 ShuffleBytes- 00:07:37.841 [2024-04-25 20:54:53.314706] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xaab 00:07:37.841 [2024-04-25 20:54:53.314918] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007b7b 00:07:37.841 [2024-04-25 20:54:53.315046] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007b7b 00:07:37.841 [2024-04-25 20:54:53.315155] 
ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007bf5 00:07:37.841 [2024-04-25 20:54:53.315369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:8a7b0085 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.841 [2024-04-25 20:54:53.315395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.841 [2024-04-25 20:54:53.315452] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.841 [2024-04-25 20:54:53.315469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.841 [2024-04-25 20:54:53.315525] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00008300 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.841 [2024-04-25 20:54:53.315539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.841 [2024-04-25 20:54:53.315592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:7b7b837b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.841 [2024-04-25 20:54:53.315607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.841 [2024-04-25 20:54:53.315661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:7b7b837b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.841 [2024-04-25 20:54:53.315674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:37.841 #65 NEW cov: 11999 ft: 15055 corp: 28/583b lim: 30 exec/s: 65 rss: 70Mb L: 30/30 MS: 1 ChangeBinInt- 00:07:37.841 [2024-04-25 20:54:53.354775] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200007b7b 00:07:37.841 [2024-04-25 20:54:53.354889] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007b7b 00:07:37.841 [2024-04-25 20:54:53.355004] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300003f7b 00:07:37.841 [2024-04-25 20:54:53.355213] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff02ff cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.841 [2024-04-25 20:54:53.355238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.841 [2024-04-25 20:54:53.355294] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:7b7b837b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.841 [2024-04-25 20:54:53.355309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.841 [2024-04-25 20:54:53.355363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:0a7b837b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.841 [2024-04-25 20:54:53.355378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.841 
#66 NEW cov: 11999 ft: 15073 corp: 29/606b lim: 30 exec/s: 66 rss: 70Mb L: 23/30 MS: 1 ChangeByte- 00:07:37.841 [2024-04-25 20:54:53.394944] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (43052) > buf size (4096) 00:07:37.841 [2024-04-25 20:54:53.395155] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300000045 00:07:37.841 [2024-04-25 20:54:53.395265] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100004545 00:07:37.841 [2024-04-25 20:54:53.395471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2a0a00ab cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.841 [2024-04-25 20:54:53.395496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.841 [2024-04-25 20:54:53.395550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.841 [2024-04-25 20:54:53.395565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.841 [2024-04-25 20:54:53.395619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00008300 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.841 [2024-04-25 20:54:53.395638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.841 [2024-04-25 20:54:53.395695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:45458145 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.841 [2024-04-25 20:54:53.395709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.841 #67 NEW cov: 11999 ft: 15096 corp: 30/632b lim: 30 exec/s: 67 rss: 70Mb L: 26/30 MS: 1 ShuffleBytes- 00:07:37.841 [2024-04-25 20:54:53.435044] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (43052) > buf size (4096) 00:07:37.841 [2024-04-25 20:54:53.435250] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100004545 00:07:37.841 [2024-04-25 20:54:53.435359] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x4545 00:07:37.841 [2024-04-25 20:54:53.435570] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2a0a00ab cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.841 [2024-04-25 20:54:53.435595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.841 [2024-04-25 20:54:53.435652] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.841 [2024-04-25 20:54:53.435666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.841 [2024-04-25 20:54:53.435720] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00458145 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.841 [2024-04-25 20:54:53.435733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 
sqhd:0011 p:0 m:0 dnr:0 00:07:37.841 [2024-04-25 20:54:53.435788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:45230045 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.841 [2024-04-25 20:54:53.435801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.841 #68 NEW cov: 11999 ft: 15112 corp: 31/656b lim: 30 exec/s: 68 rss: 70Mb L: 24/30 MS: 1 EraseBytes- 00:07:37.841 [2024-04-25 20:54:53.475190] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300000aab 00:07:37.841 [2024-04-25 20:54:53.475405] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (786436) > buf size (4096) 00:07:37.841 [2024-04-25 20:54:53.475513] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007b7b 00:07:37.841 [2024-04-25 20:54:53.475621] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007bf5 00:07:37.841 [2024-04-25 20:54:53.475830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:8a7b837b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.841 [2024-04-25 20:54:53.475855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.841 [2024-04-25 20:54:53.475910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.841 [2024-04-25 20:54:53.475924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.841 [2024-04-25 20:54:53.475979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00008300 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.841 [2024-04-25 20:54:53.475997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.841 [2024-04-25 20:54:53.476051] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:8484837b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.841 [2024-04-25 20:54:53.476067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.841 [2024-04-25 20:54:53.476122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:7b7b837b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.841 [2024-04-25 20:54:53.476136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:37.841 #69 NEW cov: 11999 ft: 15116 corp: 32/686b lim: 30 exec/s: 69 rss: 70Mb L: 30/30 MS: 1 ChangeBinInt- 00:07:38.103 [2024-04-25 20:54:53.515237] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x4545 00:07:38.103 [2024-04-25 20:54:53.515355] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000454d 00:07:38.103 [2024-04-25 20:54:53.515554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2a0a00ab cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.103 [2024-04-25 20:54:53.515579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD 
(00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.103 [2024-04-25 20:54:53.515635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:45458145 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.103 [2024-04-25 20:54:53.515650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.103 #70 NEW cov: 11999 ft: 15124 corp: 33/699b lim: 30 exec/s: 70 rss: 70Mb L: 13/30 MS: 1 ChangeBit- 00:07:38.103 [2024-04-25 20:54:53.555343] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (43052) > buf size (4096) 00:07:38.103 [2024-04-25 20:54:53.555643] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2a0a00ab cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.103 [2024-04-25 20:54:53.555668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.103 [2024-04-25 20:54:53.555726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.103 [2024-04-25 20:54:53.555740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.103 #71 NEW cov: 11999 ft: 15139 corp: 34/712b lim: 30 exec/s: 71 rss: 70Mb L: 13/30 MS: 1 EraseBytes- 00:07:38.103 [2024-04-25 20:54:53.595538] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200007b7b 00:07:38.103 [2024-04-25 20:54:53.595654] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007b7b 00:07:38.103 [2024-04-25 20:54:53.595760] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007b7b 00:07:38.103 [2024-04-25 20:54:53.595861] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007b7b 00:07:38.103 [2024-04-25 20:54:53.596074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff02ff cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.103 [2024-04-25 20:54:53.596100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.103 [2024-04-25 20:54:53.596155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:7b7b837b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.103 [2024-04-25 20:54:53.596170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.103 [2024-04-25 20:54:53.596224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:0a7b837b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.103 [2024-04-25 20:54:53.596237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.103 [2024-04-25 20:54:53.596293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:3f7b837b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.103 [2024-04-25 20:54:53.596309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:38.103 #72 NEW cov: 11999 ft: 15148 corp: 
35/737b lim: 30 exec/s: 72 rss: 70Mb L: 25/30 MS: 1 CopyPart- 00:07:38.103 [2024-04-25 20:54:53.635623] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200007b7b 00:07:38.103 [2024-04-25 20:54:53.635731] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007b7b 00:07:38.103 [2024-04-25 20:54:53.635830] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007b7b 00:07:38.103 [2024-04-25 20:54:53.635931] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007b7b 00:07:38.103 [2024-04-25 20:54:53.636144] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff02ff cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.103 [2024-04-25 20:54:53.636169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.103 [2024-04-25 20:54:53.636229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:7b7b837b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.103 [2024-04-25 20:54:53.636243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.103 [2024-04-25 20:54:53.636298] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:7b7a8382 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.103 [2024-04-25 20:54:53.636312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.104 [2024-04-25 20:54:53.636367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:7b7b837b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.104 [2024-04-25 20:54:53.636381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:38.104 #73 NEW cov: 11999 ft: 15165 corp: 36/766b lim: 30 exec/s: 73 rss: 70Mb L: 29/30 MS: 1 CopyPart- 00:07:38.104 [2024-04-25 20:54:53.675696] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300000aab 00:07:38.104 [2024-04-25 20:54:53.675810] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007b7b 00:07:38.104 [2024-04-25 20:54:53.675916] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000f51e 00:07:38.104 [2024-04-25 20:54:53.676138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:167b837b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.104 [2024-04-25 20:54:53.676164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.104 [2024-04-25 20:54:53.676219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00008300 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.104 [2024-04-25 20:54:53.676234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.104 [2024-04-25 20:54:53.676288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:7b7b835b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.104 [2024-04-25 20:54:53.676302] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.104 #74 NEW cov: 11999 ft: 15189 corp: 37/784b lim: 30 exec/s: 74 rss: 70Mb L: 18/30 MS: 1 ChangeBit- 00:07:38.104 [2024-04-25 20:54:53.715797] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:38.104 [2024-04-25 20:54:53.715907] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (786432) > buf size (4096) 00:07:38.104 [2024-04-25 20:54:53.716119] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.104 [2024-04-25 20:54:53.716149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.104 [2024-04-25 20:54:53.716204] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff022a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.104 [2024-04-25 20:54:53.716218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.104 #76 NEW cov: 11999 ft: 15202 corp: 38/797b lim: 30 exec/s: 76 rss: 70Mb L: 13/30 MS: 2 EraseBytes-CMP- DE: "\377\377\377\377\377\377\377\377"- 00:07:38.104 [2024-04-25 20:54:53.755981] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xff 00:07:38.104 [2024-04-25 20:54:53.756102] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (262144) > buf size (4096) 00:07:38.104 [2024-04-25 20:54:53.756206] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100004545 00:07:38.104 [2024-04-25 20:54:53.756310] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x4545 00:07:38.104 [2024-04-25 20:54:53.756521] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2a0a00ab cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.104 [2024-04-25 20:54:53.756547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.104 [2024-04-25 20:54:53.756604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff0005 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.104 [2024-04-25 20:54:53.756619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.104 [2024-04-25 20:54:53.756675] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00458145 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.104 [2024-04-25 20:54:53.756689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.104 [2024-04-25 20:54:53.756746] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:45230045 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.104 [2024-04-25 20:54:53.756760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:38.431 #77 NEW cov: 11999 ft: 15272 corp: 39/821b lim: 30 exec/s: 77 rss: 70Mb L: 24/30 MS: 1 CMP- DE: "\377\377\377\005"- 00:07:38.431 [2024-04-25 20:54:53.806099] 
ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (43052) > buf size (4096) 00:07:38.431 [2024-04-25 20:54:53.806294] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x45 00:07:38.431 [2024-04-25 20:54:53.806396] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100004545 00:07:38.431 [2024-04-25 20:54:53.806600] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2a0a00ab cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.431 [2024-04-25 20:54:53.806625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.431 [2024-04-25 20:54:53.806682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.431 [2024-04-25 20:54:53.806696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.431 [2024-04-25 20:54:53.806752] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.431 [2024-04-25 20:54:53.806766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.431 [2024-04-25 20:54:53.806825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:4545812c cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.431 [2024-04-25 20:54:53.806839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:38.431 #78 NEW cov: 11999 ft: 15283 corp: 40/847b lim: 30 exec/s: 78 rss: 70Mb L: 26/30 MS: 1 ChangeByte- 00:07:38.431 [2024-04-25 20:54:53.846268] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xab 00:07:38.431 [2024-04-25 20:54:53.846382] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (125996) > buf size (4096) 00:07:38.431 [2024-04-25 20:54:53.846490] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007b7b 00:07:38.431 [2024-04-25 20:54:53.846587] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007b7b 00:07:38.431 [2024-04-25 20:54:53.846692] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007bf5 00:07:38.431 [2024-04-25 20:54:53.846913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:167b007b cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.431 [2024-04-25 20:54:53.846939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.431 [2024-04-25 20:54:53.846999] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:7b0a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.431 [2024-04-25 20:54:53.847014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.431 [2024-04-25 20:54:53.847067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00008300 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.431 [2024-04-25 20:54:53.847081] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.431 [2024-04-25 20:54:53.847136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:7b7b837b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.431 [2024-04-25 20:54:53.847150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:38.431 [2024-04-25 20:54:53.847205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:7b7b837b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.431 [2024-04-25 20:54:53.847219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:38.431 #79 NEW cov: 11999 ft: 15298 corp: 41/877b lim: 30 exec/s: 79 rss: 70Mb L: 30/30 MS: 1 ShuffleBytes- 00:07:38.431 [2024-04-25 20:54:53.886399] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:38.431 [2024-04-25 20:54:53.886512] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff8a 00:07:38.431 [2024-04-25 20:54:53.886619] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007b7b 00:07:38.431 [2024-04-25 20:54:53.886722] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200007b7b 00:07:38.431 [2024-04-25 20:54:53.886822] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007bf5 00:07:38.431 [2024-04-25 20:54:53.887028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.431 [2024-04-25 20:54:53.887055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.431 [2024-04-25 20:54:53.887122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.431 [2024-04-25 20:54:53.887136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.431 [2024-04-25 20:54:53.887193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:7b7b837b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.431 [2024-04-25 20:54:53.887207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.431 [2024-04-25 20:54:53.887261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:7b7b027b cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.431 [2024-04-25 20:54:53.887276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:38.431 [2024-04-25 20:54:53.887331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:7b7b837b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.431 [2024-04-25 20:54:53.887345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:38.432 #80 NEW cov: 11999 ft: 15303 corp: 42/907b lim: 30 exec/s: 40 rss: 70Mb L: 30/30 MS: 
1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:07:38.432 #80 DONE cov: 11999 ft: 15303 corp: 42/907b lim: 30 exec/s: 40 rss: 70Mb 00:07:38.432 ###### Recommended dictionary. ###### 00:07:38.432 "\377\377\377\377\377\377\377\377" # Uses: 1 00:07:38.432 "\377\377\377\005" # Uses: 0 00:07:38.432 ###### End of recommended dictionary. ###### 00:07:38.432 Done 80 runs in 2 second(s) 00:07:38.432 20:54:54 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_1.conf /var/tmp/suppress_nvmf_fuzz 00:07:38.432 20:54:54 -- ../common.sh@72 -- # (( i++ )) 00:07:38.432 20:54:54 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:38.432 20:54:54 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:07:38.432 20:54:54 -- nvmf/run.sh@23 -- # local fuzzer_type=2 00:07:38.432 20:54:54 -- nvmf/run.sh@24 -- # local timen=1 00:07:38.432 20:54:54 -- nvmf/run.sh@25 -- # local core=0x1 00:07:38.432 20:54:54 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:38.432 20:54:54 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_2.conf 00:07:38.432 20:54:54 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:38.432 20:54:54 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:38.432 20:54:54 -- nvmf/run.sh@34 -- # printf %02d 2 00:07:38.432 20:54:54 -- nvmf/run.sh@34 -- # port=4402 00:07:38.432 20:54:54 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:38.432 20:54:54 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' 00:07:38.432 20:54:54 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4402"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:38.432 20:54:54 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:38.432 20:54:54 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:38.432 20:54:54 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' -c /tmp/fuzz_json_2.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 -Z 2 00:07:38.432 [2024-04-25 20:54:54.055447] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:07:38.432 [2024-04-25 20:54:54.055542] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid193093 ] 00:07:38.699 EAL: No free 2048 kB hugepages reported on node 1 00:07:38.699 [2024-04-25 20:54:54.198050] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 
00:07:38.699 [2024-04-25 20:54:54.236844] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:38.699 [2024-04-25 20:54:54.256267] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:38.699 [2024-04-25 20:54:54.308504] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:38.699 [2024-04-25 20:54:54.324818] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4402 *** 00:07:38.699 INFO: Running with entropic power schedule (0xFF, 100). 00:07:38.699 INFO: Seed: 3134916876 00:07:38.957 INFO: Loaded 1 modules (348579 inline 8-bit counters): 348579 [0x2764bcc, 0x27b9d6f), 00:07:38.957 INFO: Loaded 1 PC tables (348579 PCs): 348579 [0x27b9d70,0x2d0b7a0), 00:07:38.957 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:38.957 INFO: A corpus is not provided, starting from an empty corpus 00:07:38.957 #2 INITED exec/s: 0 rss: 60Mb 00:07:38.957 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:38.957 This may also happen if the target rejected all inputs we tried so far 00:07:38.957 [2024-04-25 20:54:54.401603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:1f1f001f cdw11:1f001f1f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.957 [2024-04-25 20:54:54.401638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.957 [2024-04-25 20:54:54.401767] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:1f1f001f cdw11:1f001f1f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.957 [2024-04-25 20:54:54.401784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.957 [2024-04-25 20:54:54.401914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:1f1f001f cdw11:1f001f1f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.957 [2024-04-25 20:54:54.401932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.957 [2024-04-25 20:54:54.402068] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:1f1f001f cdw11:1f001f1f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.957 [2024-04-25 20:54:54.402087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.216 NEW_FUNC[1/667]: 0x4a7010 in fuzz_admin_identify_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:95 00:07:39.216 NEW_FUNC[2/667]: 0x4e0360 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:39.216 #28 NEW cov: 11625 ft: 11626 corp: 2/30b lim: 35 exec/s: 0 rss: 68Mb L: 29/29 MS: 1 InsertRepeatedBytes- 00:07:39.216 [2024-04-25 20:54:54.741708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:55550055 cdw11:55005555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.216 [2024-04-25 20:54:54.741754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.216 NEW_FUNC[1/3]: 0x171b800 in nvme_complete_register_operations 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:726 00:07:39.216 NEW_FUNC[2/3]: 0x172e960 in nvme_robust_mutex_lock /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/./nvme_internal.h:1140 00:07:39.216 #32 NEW cov: 11795 ft: 12917 corp: 3/41b lim: 35 exec/s: 0 rss: 68Mb L: 11/29 MS: 4 CopyPart-ChangeBit-ChangeByte-InsertRepeatedBytes- 00:07:39.216 [2024-04-25 20:54:54.791730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:55550055 cdw11:55005555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.216 [2024-04-25 20:54:54.791759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.216 #33 NEW cov: 11801 ft: 13195 corp: 4/52b lim: 35 exec/s: 0 rss: 69Mb L: 11/29 MS: 1 ShuffleBytes- 00:07:39.216 [2024-04-25 20:54:54.841878] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:55550055 cdw11:55005555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.216 [2024-04-25 20:54:54.841905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.216 #39 NEW cov: 11886 ft: 13380 corp: 5/59b lim: 35 exec/s: 0 rss: 69Mb L: 7/29 MS: 1 EraseBytes- 00:07:39.475 [2024-04-25 20:54:54.882059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:55550055 cdw11:55005555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.475 [2024-04-25 20:54:54.882085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.475 #40 NEW cov: 11886 ft: 13520 corp: 6/69b lim: 35 exec/s: 0 rss: 69Mb L: 10/29 MS: 1 EraseBytes- 00:07:39.475 [2024-04-25 20:54:54.922546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:1f1f001f cdw11:1f001f1f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.475 [2024-04-25 20:54:54.922571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.476 [2024-04-25 20:54:54.922694] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:1f1f001f cdw11:1f001f1f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.476 [2024-04-25 20:54:54.922710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.476 [2024-04-25 20:54:54.922834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:1f1f001f cdw11:1f001f1f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.476 [2024-04-25 20:54:54.922850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.476 #41 NEW cov: 11886 ft: 13843 corp: 7/91b lim: 35 exec/s: 0 rss: 69Mb L: 22/29 MS: 1 EraseBytes- 00:07:39.476 [2024-04-25 20:54:54.972282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:5555000a cdw11:55005555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.476 [2024-04-25 20:54:54.972310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.476 #42 NEW cov: 11886 ft: 13932 corp: 8/98b lim: 35 exec/s: 0 rss: 69Mb L: 7/29 MS: 1 CrossOver- 00:07:39.476 [2024-04-25 20:54:55.012458] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:55550055 cdw11:55005555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.476 [2024-04-25 20:54:55.012485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.476 #43 NEW cov: 11886 ft: 13969 corp: 9/109b lim: 35 exec/s: 0 rss: 69Mb L: 11/29 MS: 1 ChangeBit- 00:07:39.476 [2024-04-25 20:54:55.052462] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:39550055 cdw11:55005555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.476 [2024-04-25 20:54:55.052489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.476 #44 NEW cov: 11886 ft: 14018 corp: 10/119b lim: 35 exec/s: 0 rss: 69Mb L: 10/29 MS: 1 ChangeByte- 00:07:39.476 [2024-04-25 20:54:55.092785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:55550055 cdw11:890055fe SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.476 [2024-04-25 20:54:55.092811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.476 [2024-04-25 20:54:55.092929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:bca30012 cdw11:5500af51 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.476 [2024-04-25 20:54:55.092945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.476 #45 NEW cov: 11886 ft: 14254 corp: 11/134b lim: 35 exec/s: 0 rss: 69Mb L: 15/29 MS: 1 CMP- DE: "\376\211\003\022\274\243\257Q"- 00:07:39.735 [2024-04-25 20:54:55.142778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:5555000a cdw11:55005529 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.735 [2024-04-25 20:54:55.142804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.735 #46 NEW cov: 11886 ft: 14269 corp: 12/141b lim: 35 exec/s: 0 rss: 69Mb L: 7/29 MS: 1 ChangeByte- 00:07:39.735 [2024-04-25 20:54:55.182949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:55550055 cdw11:55005555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.735 [2024-04-25 20:54:55.182975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.735 #47 NEW cov: 11886 ft: 14293 corp: 13/148b lim: 35 exec/s: 0 rss: 69Mb L: 7/29 MS: 1 ShuffleBytes- 00:07:39.735 [2024-04-25 20:54:55.223024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:55550055 cdw11:55005521 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.735 [2024-04-25 20:54:55.223049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.735 #48 NEW cov: 11886 ft: 14338 corp: 14/156b lim: 35 exec/s: 0 rss: 69Mb L: 8/29 MS: 1 InsertByte- 00:07:39.735 [2024-04-25 20:54:55.263085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:55550055 cdw11:55005557 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.735 [2024-04-25 20:54:55.263111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:07:39.735 NEW_FUNC[1/1]: 0x19e6b30 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:39.735 #49 NEW cov: 11909 ft: 14465 corp: 15/167b lim: 35 exec/s: 0 rss: 70Mb L: 11/29 MS: 1 ChangeBit- 00:07:39.735 [2024-04-25 20:54:55.313250] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:5555000a cdw11:5500000c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.735 [2024-04-25 20:54:55.313276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.735 #50 NEW cov: 11909 ft: 14501 corp: 16/176b lim: 35 exec/s: 0 rss: 70Mb L: 9/29 MS: 1 CMP- DE: "\000\014"- 00:07:39.735 [2024-04-25 20:54:55.353953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:1f1f001f cdw11:1f001f1f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.735 [2024-04-25 20:54:55.353979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.735 [2024-04-25 20:54:55.354109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:1f1f001f cdw11:1f001f1f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.735 [2024-04-25 20:54:55.354127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.735 [2024-04-25 20:54:55.354245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:1f1f001f cdw11:1f001f1f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.735 [2024-04-25 20:54:55.354260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.735 [2024-04-25 20:54:55.354379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:1f1f001f cdw11:1f001f1f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.735 [2024-04-25 20:54:55.354395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.735 #51 NEW cov: 11909 ft: 14520 corp: 17/206b lim: 35 exec/s: 51 rss: 70Mb L: 30/30 MS: 1 InsertByte- 00:07:39.735 [2024-04-25 20:54:55.393471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:55550055 cdw11:55002155 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.735 [2024-04-25 20:54:55.393497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.995 #52 NEW cov: 11909 ft: 14612 corp: 18/214b lim: 35 exec/s: 52 rss: 70Mb L: 8/30 MS: 1 ShuffleBytes- 00:07:39.995 [2024-04-25 20:54:55.433787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:fe890055 cdw11:bc000312 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.995 [2024-04-25 20:54:55.433816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.995 [2024-04-25 20:54:55.433943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:513900af cdw11:55005555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.995 [2024-04-25 20:54:55.433961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.995 
#53 NEW cov: 11909 ft: 14693 corp: 19/232b lim: 35 exec/s: 53 rss: 70Mb L: 18/30 MS: 1 PersAutoDict- DE: "\376\211\003\022\274\243\257Q"- 00:07:39.995 [2024-04-25 20:54:55.473673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:55550055 cdw11:55005520 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.995 [2024-04-25 20:54:55.473699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.995 #54 NEW cov: 11909 ft: 14704 corp: 20/240b lim: 35 exec/s: 54 rss: 70Mb L: 8/30 MS: 1 ChangeBit- 00:07:39.995 [2024-04-25 20:54:55.513877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:fe890055 cdw11:55005555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.995 [2024-04-25 20:54:55.513902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.995 #55 NEW cov: 11909 ft: 14728 corp: 21/250b lim: 35 exec/s: 55 rss: 70Mb L: 10/30 MS: 1 EraseBytes- 00:07:39.995 [2024-04-25 20:54:55.554055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:55550055 cdw11:0300fe89 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.995 [2024-04-25 20:54:55.554081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.995 [2024-04-25 20:54:55.554209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:a3af00bc cdw11:55005155 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.995 [2024-04-25 20:54:55.554224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.995 #56 NEW cov: 11909 ft: 14739 corp: 22/265b lim: 35 exec/s: 56 rss: 70Mb L: 15/30 MS: 1 PersAutoDict- DE: "\376\211\003\022\274\243\257Q"- 00:07:39.995 [2024-04-25 20:54:55.594524] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:1f1f001f cdw11:1f001f1f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.995 [2024-04-25 20:54:55.594550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.995 [2024-04-25 20:54:55.594691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:1f1f001f cdw11:1f001f1f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.995 [2024-04-25 20:54:55.594708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.995 [2024-04-25 20:54:55.594826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:1f1f001f cdw11:1f001f1f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.995 [2024-04-25 20:54:55.594844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.995 #57 NEW cov: 11909 ft: 14802 corp: 23/287b lim: 35 exec/s: 57 rss: 70Mb L: 22/30 MS: 1 CopyPart- 00:07:39.995 [2024-04-25 20:54:55.634411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:55550055 cdw11:55002155 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.995 [2024-04-25 20:54:55.634436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 
dnr:0 00:07:39.995 [2024-04-25 20:54:55.634556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:55550055 cdw11:55005555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.995 [2024-04-25 20:54:55.634572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.254 #58 NEW cov: 11909 ft: 14821 corp: 24/306b lim: 35 exec/s: 58 rss: 70Mb L: 19/30 MS: 1 CrossOver- 00:07:40.254 [2024-04-25 20:54:55.674323] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:55550055 cdw11:45005555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.254 [2024-04-25 20:54:55.674350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.254 #59 NEW cov: 11909 ft: 14825 corp: 25/317b lim: 35 exec/s: 59 rss: 70Mb L: 11/30 MS: 1 ChangeBit- 00:07:40.254 [2024-04-25 20:54:55.714438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:55550055 cdw11:45005555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.254 [2024-04-25 20:54:55.714465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.254 #60 NEW cov: 11909 ft: 14837 corp: 26/328b lim: 35 exec/s: 60 rss: 70Mb L: 11/30 MS: 1 ShuffleBytes- 00:07:40.254 [2024-04-25 20:54:55.754546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:55550055 cdw11:00005555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.254 [2024-04-25 20:54:55.754573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.254 #61 NEW cov: 11909 ft: 14853 corp: 27/338b lim: 35 exec/s: 61 rss: 70Mb L: 10/30 MS: 1 ChangeBinInt- 00:07:40.254 [2024-04-25 20:54:55.794722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:55550055 cdw11:5d005555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.254 [2024-04-25 20:54:55.794749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.254 #62 NEW cov: 11909 ft: 14883 corp: 28/349b lim: 35 exec/s: 62 rss: 70Mb L: 11/30 MS: 1 ChangeByte- 00:07:40.254 [2024-04-25 20:54:55.834872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:55550055 cdw11:5500555d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.254 [2024-04-25 20:54:55.834899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.254 #63 NEW cov: 11909 ft: 14912 corp: 29/359b lim: 35 exec/s: 63 rss: 70Mb L: 10/30 MS: 1 ChangeBit- 00:07:40.254 [2024-04-25 20:54:55.874964] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:55550055 cdw11:55005555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.254 [2024-04-25 20:54:55.874997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.254 #64 NEW cov: 11909 ft: 14922 corp: 30/369b lim: 35 exec/s: 64 rss: 70Mb L: 10/30 MS: 1 CopyPart- 00:07:40.254 [2024-04-25 20:54:55.915114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:55550055 cdw11:00005555 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:07:40.254 [2024-04-25 20:54:55.915141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.513 #65 NEW cov: 11909 ft: 14933 corp: 31/379b lim: 35 exec/s: 65 rss: 70Mb L: 10/30 MS: 1 CopyPart- 00:07:40.513 [2024-04-25 20:54:55.965286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:39550055 cdw11:55005555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.513 [2024-04-25 20:54:55.965313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.513 #66 NEW cov: 11909 ft: 14960 corp: 32/389b lim: 35 exec/s: 66 rss: 70Mb L: 10/30 MS: 1 ChangeBit- 00:07:40.513 [2024-04-25 20:54:56.005275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:55550055 cdw11:0000555d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.513 [2024-04-25 20:54:56.005305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.513 #67 NEW cov: 11909 ft: 15000 corp: 33/399b lim: 35 exec/s: 67 rss: 70Mb L: 10/30 MS: 1 ChangeByte- 00:07:40.513 [2024-04-25 20:54:56.045480] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a550055 cdw11:0000555d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.513 [2024-04-25 20:54:56.045507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.513 #68 NEW cov: 11909 ft: 15005 corp: 34/409b lim: 35 exec/s: 68 rss: 70Mb L: 10/30 MS: 1 ChangeBinInt- 00:07:40.513 [2024-04-25 20:54:56.085818] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:55550055 cdw11:0300fe89 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.513 [2024-04-25 20:54:56.085845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.513 [2024-04-25 20:54:56.085979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:bca30012 cdw11:5500af51 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.513 [2024-04-25 20:54:56.086000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.513 #69 NEW cov: 11909 ft: 15015 corp: 35/425b lim: 35 exec/s: 69 rss: 70Mb L: 16/30 MS: 1 InsertByte- 00:07:40.513 [2024-04-25 20:54:56.126352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:1f1f001f cdw11:1f001f1f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.513 [2024-04-25 20:54:56.126379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.513 [2024-04-25 20:54:56.126503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:1f1f001f cdw11:1f001f1f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.513 [2024-04-25 20:54:56.126519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.513 [2024-04-25 20:54:56.126638] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:1f1f001f cdw11:00001f1f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.513 [2024-04-25 20:54:56.126657] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.513 [2024-04-25 20:54:56.126777] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ede100fc cdw11:fc00f2c6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.513 [2024-04-25 20:54:56.126795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.513 #70 NEW cov: 11909 ft: 15017 corp: 36/455b lim: 35 exec/s: 70 rss: 70Mb L: 30/30 MS: 1 CMP- DE: "\000v\374\355\341\362\306\374"- 00:07:40.773 [2024-04-25 20:54:56.175875] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:55ae0055 cdw11:5500aa55 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.773 [2024-04-25 20:54:56.175903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.773 #71 NEW cov: 11909 ft: 15020 corp: 37/466b lim: 35 exec/s: 71 rss: 70Mb L: 11/30 MS: 1 ChangeBinInt- 00:07:40.773 [2024-04-25 20:54:56.216632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:1f1f001f cdw11:1f001f1f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.773 [2024-04-25 20:54:56.216658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.773 [2024-04-25 20:54:56.216782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:1f1f001f cdw11:1f001f1f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.773 [2024-04-25 20:54:56.216799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.773 [2024-04-25 20:54:56.216917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:1f1f001f cdw11:1f001f1f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.773 [2024-04-25 20:54:56.216939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.773 [2024-04-25 20:54:56.217070] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:1f1f001f cdw11:1f001f1f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.773 [2024-04-25 20:54:56.217088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.773 #72 NEW cov: 11909 ft: 15025 corp: 38/496b lim: 35 exec/s: 72 rss: 71Mb L: 30/30 MS: 1 CopyPart- 00:07:40.773 [2024-04-25 20:54:56.256663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:1f1f001f cdw11:1f001f1f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.773 [2024-04-25 20:54:56.256690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.773 [2024-04-25 20:54:56.256803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:1f1f001f cdw11:1f001f1f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.773 [2024-04-25 20:54:56.256820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.773 [2024-04-25 20:54:56.256948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 
nsid:0 cdw10:1f1f001f cdw11:1f001f1f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.773 [2024-04-25 20:54:56.256967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.773 [2024-04-25 20:54:56.257094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:1f1f001f cdw11:1f001f1f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.773 [2024-04-25 20:54:56.257111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.773 #73 NEW cov: 11909 ft: 15061 corp: 39/525b lim: 35 exec/s: 73 rss: 71Mb L: 29/30 MS: 1 ShuffleBytes- 00:07:40.773 [2024-04-25 20:54:56.296166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:57550055 cdw11:5d005555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.773 [2024-04-25 20:54:56.296191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.773 #74 NEW cov: 11909 ft: 15134 corp: 40/536b lim: 35 exec/s: 74 rss: 71Mb L: 11/30 MS: 1 ChangeBinInt- 00:07:40.773 [2024-04-25 20:54:56.336283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:55320055 cdw11:5500aa55 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.773 [2024-04-25 20:54:56.336308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.773 #75 NEW cov: 11909 ft: 15135 corp: 41/547b lim: 35 exec/s: 75 rss: 71Mb L: 11/30 MS: 1 ChangeByte- 00:07:40.773 [2024-04-25 20:54:56.376215] ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.773 [2024-04-25 20:54:56.376392] ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.773 [2024-04-25 20:54:56.376855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.773 [2024-04-25 20:54:56.376885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.773 [2024-04-25 20:54:56.377006] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:55000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.773 [2024-04-25 20:54:56.377025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.773 [2024-04-25 20:54:56.377148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:fe890055 cdw11:bc000312 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.773 [2024-04-25 20:54:56.377169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.773 #76 NEW cov: 11918 ft: 15182 corp: 42/574b lim: 35 exec/s: 38 rss: 71Mb L: 27/30 MS: 1 InsertRepeatedBytes- 00:07:40.773 #76 DONE cov: 11918 ft: 15182 corp: 42/574b lim: 35 exec/s: 38 rss: 71Mb 00:07:40.773 ###### Recommended dictionary. ###### 00:07:40.773 "\376\211\003\022\274\243\257Q" # Uses: 2 00:07:40.773 "\000\014" # Uses: 0 00:07:40.773 "\000v\374\355\341\362\306\374" # Uses: 0 00:07:40.773 ###### End of recommended dictionary. 
###### 00:07:40.773 Done 76 runs in 2 second(s) 00:07:41.032 20:54:56 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_2.conf /var/tmp/suppress_nvmf_fuzz 00:07:41.032 20:54:56 -- ../common.sh@72 -- # (( i++ )) 00:07:41.032 20:54:56 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:41.032 20:54:56 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:07:41.032 20:54:56 -- nvmf/run.sh@23 -- # local fuzzer_type=3 00:07:41.032 20:54:56 -- nvmf/run.sh@24 -- # local timen=1 00:07:41.032 20:54:56 -- nvmf/run.sh@25 -- # local core=0x1 00:07:41.032 20:54:56 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:41.032 20:54:56 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_3.conf 00:07:41.032 20:54:56 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:41.032 20:54:56 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:41.032 20:54:56 -- nvmf/run.sh@34 -- # printf %02d 3 00:07:41.032 20:54:56 -- nvmf/run.sh@34 -- # port=4403 00:07:41.033 20:54:56 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:41.033 20:54:56 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' 00:07:41.033 20:54:56 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4403"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:41.033 20:54:56 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:41.033 20:54:56 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:41.033 20:54:56 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' -c /tmp/fuzz_json_3.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 -Z 3 00:07:41.033 [2024-04-25 20:54:56.545307] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:07:41.033 [2024-04-25 20:54:56.545376] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid193632 ] 00:07:41.033 EAL: No free 2048 kB hugepages reported on node 1 00:07:41.033 [2024-04-25 20:54:56.684144] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:41.291 [2024-04-25 20:54:56.720413] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:41.291 [2024-04-25 20:54:56.739494] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:41.291 [2024-04-25 20:54:56.791595] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:41.291 [2024-04-25 20:54:56.807867] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4403 *** 00:07:41.291 INFO: Running with entropic power schedule (0xFF, 100). 
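The run.sh trace above wires one fuzzer round end to end: derive the port from the round number (4400 + 3 = 4403), template the shared fuzz_json.conf onto that port with sed, point LSAN at a suppression file for two allocations the target holds on purpose, and launch llvm_nvme_fuzz against a fresh per-round corpus directory. A condensed sketch of that wiring follows; the workspace root variable and the redirects (the sed output and the leak: lines into the suppression file) are assumptions, everything else mirrors the logged commands.

#!/usr/bin/env bash
# Sketch of one nvmf fuzzer round, reconstructed from the trace above.
# $rootdir and the output redirects are assumed; the flags are as logged.
round=3
port=$((4400 + round))                          # printf %02d 3 -> trsvcid 4403
cfg=/tmp/fuzz_json_${round}.conf
corpus=$rootdir/../corpus/llvm_nvmf_${round}
mkdir -p "$corpus"
sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"${port}\"/" \
    "$rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$cfg"
# LSAN: suppress two allocations the target intentionally keeps alive
printf 'leak:spdk_nvmf_qpair_disconnect\nleak:nvmf_ctrlr_create\n' \
    > /var/tmp/suppress_nvmf_fuzz
export LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0
"$rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m 0x1 -s 512 \
    -P "$rootdir/../output/llvm/" \
    -F "trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:${port}" \
    -c "$cfg" -t 1 -D "$corpus" -Z "$round"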
00:07:41.291 INFO: Seed: 1322932174 00:07:41.291 INFO: Loaded 1 modules (348579 inline 8-bit counters): 348579 [0x2764bcc, 0x27b9d6f), 00:07:41.291 INFO: Loaded 1 PC tables (348579 PCs): 348579 [0x27b9d70,0x2d0b7a0), 00:07:41.291 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:41.291 INFO: A corpus is not provided, starting from an empty corpus 00:07:41.291 #2 INITED exec/s: 0 rss: 60Mb 00:07:41.291 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:41.291 This may also happen if the target rejected all inputs we tried so far 00:07:41.550 NEW_FUNC[1/659]: 0x4a8ce0 in fuzz_admin_abort_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:114 00:07:41.550 NEW_FUNC[2/659]: 0x4e0360 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:41.550 #5 NEW cov: 11577 ft: 11578 corp: 2/12b lim: 20 exec/s: 0 rss: 68Mb L: 11/11 MS: 3 InsertByte-InsertByte-CMP- DE: "\000v\374\356\324#\3644"- 00:07:41.550 #6 NEW cov: 11707 ft: 12098 corp: 3/23b lim: 20 exec/s: 0 rss: 68Mb L: 11/11 MS: 1 ChangeBit- 00:07:41.809 #7 NEW cov: 11730 ft: 12746 corp: 4/42b lim: 20 exec/s: 0 rss: 68Mb L: 19/19 MS: 1 InsertRepeatedBytes- 00:07:41.809 #8 NEW cov: 11815 ft: 13032 corp: 5/51b lim: 20 exec/s: 0 rss: 68Mb L: 9/19 MS: 1 CrossOver- 00:07:41.809 #12 NEW cov: 11815 ft: 13121 corp: 6/62b lim: 20 exec/s: 0 rss: 68Mb L: 11/19 MS: 4 ShuffleBytes-CrossOver-CMP-PersAutoDict- DE: "\377\377"-"\000v\374\356\324#\3644"- 00:07:41.809 #15 NEW cov: 11815 ft: 13455 corp: 7/66b lim: 20 exec/s: 0 rss: 68Mb L: 4/19 MS: 3 ChangeBit-PersAutoDict-InsertByte- DE: "\377\377"- 00:07:41.809 #16 NEW cov: 11815 ft: 13563 corp: 8/77b lim: 20 exec/s: 0 rss: 68Mb L: 11/19 MS: 1 ChangeBinInt- 00:07:41.809 #17 NEW cov: 11815 ft: 13661 corp: 9/88b lim: 20 exec/s: 0 rss: 69Mb L: 11/19 MS: 1 ChangeBinInt- 00:07:41.809 #18 NEW cov: 11815 ft: 13756 corp: 10/108b lim: 20 exec/s: 0 rss: 69Mb L: 20/20 MS: 1 InsertByte- 00:07:42.068 #19 NEW cov: 11815 ft: 13856 corp: 11/128b lim: 20 exec/s: 0 rss: 69Mb L: 20/20 MS: 1 CrossOver- 00:07:42.068 #20 NEW cov: 11815 ft: 13896 corp: 12/132b lim: 20 exec/s: 0 rss: 69Mb L: 4/20 MS: 1 ShuffleBytes- 00:07:42.068 #21 NEW cov: 11815 ft: 13924 corp: 13/152b lim: 20 exec/s: 0 rss: 69Mb L: 20/20 MS: 1 ChangeBit- 00:07:42.068 #23 NEW cov: 11819 ft: 14078 corp: 14/166b lim: 20 exec/s: 0 rss: 69Mb L: 14/20 MS: 2 CrossOver-InsertRepeatedBytes- 00:07:42.068 #24 NEW cov: 11819 ft: 14098 corp: 15/185b lim: 20 exec/s: 0 rss: 69Mb L: 19/20 MS: 1 ShuffleBytes- 00:07:42.068 #25 NEW cov: 11819 ft: 14119 corp: 16/197b lim: 20 exec/s: 0 rss: 69Mb L: 12/20 MS: 1 InsertByte- 00:07:42.326 NEW_FUNC[1/1]: 0x19e6b30 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:42.326 #27 NEW cov: 11842 ft: 14167 corp: 17/207b lim: 20 exec/s: 0 rss: 69Mb L: 10/20 MS: 2 ChangeBinInt-InsertByte- 00:07:42.326 #28 NEW cov: 11842 ft: 14183 corp: 18/218b lim: 20 exec/s: 0 rss: 70Mb L: 11/20 MS: 1 ChangeByte- 00:07:42.326 #29 NEW cov: 11842 ft: 14206 corp: 19/237b lim: 20 exec/s: 29 rss: 70Mb L: 19/20 MS: 1 InsertRepeatedBytes- 00:07:42.326 #30 NEW cov: 11842 ft: 14222 corp: 20/248b lim: 20 exec/s: 30 rss: 70Mb L: 11/20 MS: 1 ChangeBit- 00:07:42.326 #31 NEW cov: 11842 ft: 14238 corp: 21/268b lim: 20 exec/s: 31 rss: 70Mb L: 20/20 MS: 1 CopyPart- 00:07:42.326 NEW_FUNC[1/4]: 0x11822c0 in 
nvmf_qpair_abort_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3283 00:07:42.326 NEW_FUNC[2/4]: 0x1182e40 in nvmf_qpair_abort_aer /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3225 00:07:42.326 #32 NEW cov: 11924 ft: 14358 corp: 22/287b lim: 20 exec/s: 32 rss: 70Mb L: 19/20 MS: 1 ChangeBinInt- 00:07:42.586 #33 NEW cov: 11924 ft: 14367 corp: 23/307b lim: 20 exec/s: 33 rss: 70Mb L: 20/20 MS: 1 CopyPart- 00:07:42.586 #34 NEW cov: 11924 ft: 14388 corp: 24/311b lim: 20 exec/s: 34 rss: 70Mb L: 4/20 MS: 1 ChangeBinInt- 00:07:42.586 #35 NEW cov: 11924 ft: 14398 corp: 25/321b lim: 20 exec/s: 35 rss: 70Mb L: 10/20 MS: 1 EraseBytes- 00:07:42.586 #36 NEW cov: 11924 ft: 14412 corp: 26/325b lim: 20 exec/s: 36 rss: 70Mb L: 4/20 MS: 1 ChangeBit- 00:07:42.586 #37 NEW cov: 11924 ft: 14427 corp: 27/342b lim: 20 exec/s: 37 rss: 70Mb L: 17/20 MS: 1 InsertRepeatedBytes- 00:07:42.586 #38 NEW cov: 11924 ft: 14458 corp: 28/354b lim: 20 exec/s: 38 rss: 70Mb L: 12/20 MS: 1 PersAutoDict- DE: "\000v\374\356\324#\3644"- 00:07:42.845 #39 NEW cov: 11924 ft: 14488 corp: 29/364b lim: 20 exec/s: 39 rss: 70Mb L: 10/20 MS: 1 ChangeByte- 00:07:42.845 #40 NEW cov: 11924 ft: 14496 corp: 30/381b lim: 20 exec/s: 40 rss: 70Mb L: 17/20 MS: 1 CopyPart- 00:07:42.845 #41 NEW cov: 11924 ft: 14548 corp: 31/391b lim: 20 exec/s: 41 rss: 70Mb L: 10/20 MS: 1 EraseBytes- 00:07:42.845 #42 NEW cov: 11924 ft: 14556 corp: 32/401b lim: 20 exec/s: 42 rss: 70Mb L: 10/20 MS: 1 EraseBytes- 00:07:42.845 #43 NEW cov: 11924 ft: 14571 corp: 33/420b lim: 20 exec/s: 43 rss: 70Mb L: 19/20 MS: 1 CMP- DE: "\000\000"- 00:07:42.845 #44 NEW cov: 11924 ft: 14575 corp: 34/432b lim: 20 exec/s: 44 rss: 70Mb L: 12/20 MS: 1 ChangeByte- 00:07:42.845 #45 NEW cov: 11924 ft: 14646 corp: 35/443b lim: 20 exec/s: 45 rss: 70Mb L: 11/20 MS: 1 ChangeASCIIInt- 00:07:43.105 #46 NEW cov: 11924 ft: 14676 corp: 36/455b lim: 20 exec/s: 46 rss: 70Mb L: 12/20 MS: 1 InsertByte- 00:07:43.105 #47 NEW cov: 11924 ft: 14681 corp: 37/472b lim: 20 exec/s: 47 rss: 70Mb L: 17/20 MS: 1 ShuffleBytes- 00:07:43.105 #48 NEW cov: 11924 ft: 14718 corp: 38/492b lim: 20 exec/s: 48 rss: 70Mb L: 20/20 MS: 1 CrossOver- 00:07:43.105 #49 NEW cov: 11924 ft: 14743 corp: 39/507b lim: 20 exec/s: 49 rss: 70Mb L: 15/20 MS: 1 InsertByte- 00:07:43.105 #50 NEW cov: 11924 ft: 14749 corp: 40/514b lim: 20 exec/s: 50 rss: 70Mb L: 7/20 MS: 1 CrossOver- 00:07:43.105 #51 NEW cov: 11924 ft: 14778 corp: 41/524b lim: 20 exec/s: 51 rss: 70Mb L: 10/20 MS: 1 ChangeByte- 00:07:43.366 #52 NEW cov: 11924 ft: 14788 corp: 42/544b lim: 20 exec/s: 52 rss: 70Mb L: 20/20 MS: 1 ChangeBit- 00:07:43.366 #53 NEW cov: 11924 ft: 14794 corp: 43/556b lim: 20 exec/s: 53 rss: 70Mb L: 12/20 MS: 1 CrossOver- 00:07:43.366 #54 NEW cov: 11924 ft: 14811 corp: 44/569b lim: 20 exec/s: 27 rss: 70Mb L: 13/20 MS: 1 PersAutoDict- DE: "\000\000"- 00:07:43.366 #54 DONE cov: 11924 ft: 14811 corp: 44/569b lim: 20 exec/s: 27 rss: 70Mb 00:07:43.366 ###### Recommended dictionary. ###### 00:07:43.366 "\000v\374\356\324#\3644" # Uses: 2 00:07:43.366 "\377\377" # Uses: 1 00:07:43.366 "\000\000" # Uses: 1 00:07:43.366 ###### End of recommended dictionary. 
###### 00:07:43.366 Done 54 runs in 2 second(s) 00:07:43.366 20:54:58 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_3.conf /var/tmp/suppress_nvmf_fuzz 00:07:43.366 20:54:58 -- ../common.sh@72 -- # (( i++ )) 00:07:43.366 20:54:58 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:43.366 20:54:58 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:07:43.366 20:54:58 -- nvmf/run.sh@23 -- # local fuzzer_type=4 00:07:43.366 20:54:58 -- nvmf/run.sh@24 -- # local timen=1 00:07:43.366 20:54:58 -- nvmf/run.sh@25 -- # local core=0x1 00:07:43.366 20:54:58 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:43.366 20:54:58 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_4.conf 00:07:43.366 20:54:58 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:43.366 20:54:58 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:43.366 20:54:58 -- nvmf/run.sh@34 -- # printf %02d 4 00:07:43.366 20:54:58 -- nvmf/run.sh@34 -- # port=4404 00:07:43.366 20:54:58 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:43.366 20:54:58 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' 00:07:43.366 20:54:58 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4404"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:43.366 20:54:58 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:43.366 20:54:58 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:43.366 20:54:58 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' -c /tmp/fuzz_json_4.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 -Z 4 00:07:43.366 [2024-04-25 20:54:59.014223] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:07:43.366 [2024-04-25 20:54:59.014296] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid194011 ] 00:07:43.626 EAL: No free 2048 kB hugepages reported on node 1 00:07:43.626 [2024-04-25 20:54:59.163199] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:43.626 [2024-04-25 20:54:59.200054] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:43.626 [2024-04-25 20:54:59.219268] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:43.626 [2024-04-25 20:54:59.271444] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:43.626 [2024-04-25 20:54:59.287772] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4404 *** 00:07:43.884 INFO: Running with entropic power schedule (0xFF, 100). 
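Each #NN NEW line in these runs is libFuzzer reporting that a mutated input reached new coverage and was kept in the corpus. Reading the fields, roughly: cov is covered code edges, ft is coverage features, corp is corpus units/total bytes, lim is the current input-length cap, exec/s is executions per second, rss is resident memory, L is this unit's length over the longest unit seen so far, and MS is the mutation sequence that produced the unit. Given a saved copy of this console output (fuzzer_run.log here is hypothetical), the end-of-run summary is easy to pull out:

# Print the final "#NN DONE ..." summary of a saved run:
awk '/ DONE / { done = $0 } END { print done }' fuzzer_run.log
# e.g. #76 DONE cov: 11918 ft: 15182 corp: 42/574b lim: 35 exec/s: 38 rss: 71Mb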
00:07:43.884 INFO: Seed: 3802933377 00:07:43.884 INFO: Loaded 1 modules (348579 inline 8-bit counters): 348579 [0x2764bcc, 0x27b9d6f), 00:07:43.884 INFO: Loaded 1 PC tables (348579 PCs): 348579 [0x27b9d70,0x2d0b7a0), 00:07:43.884 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:43.884 INFO: A corpus is not provided, starting from an empty corpus 00:07:43.884 #2 INITED exec/s: 0 rss: 60Mb 00:07:43.884 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:43.884 This may also happen if the target rejected all inputs we tried so far 00:07:43.884 [2024-04-25 20:54:59.342979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:a4a40aa4 cdw11:a4a40001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.884 [2024-04-25 20:54:59.343012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.144 NEW_FUNC[1/670]: 0x4a9dd0 in fuzz_admin_create_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:126 00:07:44.144 NEW_FUNC[2/670]: 0x4e0360 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:44.144 #24 NEW cov: 11678 ft: 11679 corp: 2/9b lim: 35 exec/s: 0 rss: 68Mb L: 8/8 MS: 2 InsertByte-InsertRepeatedBytes- 00:07:44.144 [2024-04-25 20:54:59.643830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:a4a40aa4 cdw11:a4a40001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.144 [2024-04-25 20:54:59.643860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.144 NEW_FUNC[1/1]: 0x1a1c370 in sock_group_impl_poll_count /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/sock/sock.c:712 00:07:44.144 #25 NEW cov: 11816 ft: 12303 corp: 3/17b lim: 35 exec/s: 0 rss: 68Mb L: 8/8 MS: 1 ChangeBit- 00:07:44.144 [2024-04-25 20:54:59.694073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:a40a0aa4 cdw11:a4a40001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.144 [2024-04-25 20:54:59.694098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.144 [2024-04-25 20:54:59.694156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:a52ba4a4 cdw11:a4a40001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.144 [2024-04-25 20:54:59.694170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.144 #26 NEW cov: 11822 ft: 13287 corp: 4/33b lim: 35 exec/s: 0 rss: 68Mb L: 16/16 MS: 1 CrossOver- 00:07:44.144 [2024-04-25 20:54:59.734475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:a40a0aa4 cdw11:a4a40001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.144 [2024-04-25 20:54:59.734502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.144 [2024-04-25 20:54:59.734562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:a553a4a4 cdw11:53530002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.144 [2024-04-25 
20:54:59.734576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.144 [2024-04-25 20:54:59.734635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:53535353 cdw11:53530002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.144 [2024-04-25 20:54:59.734649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.144 [2024-04-25 20:54:59.734704] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:53535353 cdw11:2ba40001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.144 [2024-04-25 20:54:59.734722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.144 #27 NEW cov: 11907 ft: 13872 corp: 5/64b lim: 35 exec/s: 0 rss: 68Mb L: 31/31 MS: 1 InsertRepeatedBytes- 00:07:44.144 [2024-04-25 20:54:59.784179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:a4a4a40a cdw11:a4a40001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.144 [2024-04-25 20:54:59.784204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.144 #28 NEW cov: 11907 ft: 14073 corp: 6/72b lim: 35 exec/s: 0 rss: 68Mb L: 8/31 MS: 1 ShuffleBytes- 00:07:44.404 [2024-04-25 20:54:59.824734] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:a40a0aa4 cdw11:a4a40001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.404 [2024-04-25 20:54:59.824760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.404 [2024-04-25 20:54:59.824818] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:a52ba4a4 cdw11:a4a40000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.404 [2024-04-25 20:54:59.824832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.404 [2024-04-25 20:54:59.824889] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:a4a4a4a4 cdw11:a4a50000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.404 [2024-04-25 20:54:59.824902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.404 [2024-04-25 20:54:59.824958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:a4a4a4a4 cdw11:a4a40000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.404 [2024-04-25 20:54:59.824971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.404 #34 NEW cov: 11907 ft: 14146 corp: 7/100b lim: 35 exec/s: 0 rss: 68Mb L: 28/31 MS: 1 CrossOver- 00:07:44.404 [2024-04-25 20:54:59.864530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0aa4a40a cdw11:a4a40001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.404 [2024-04-25 20:54:59.864555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.404 [2024-04-25 20:54:59.864611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ 
(05) qid:0 cid:5 nsid:0 cdw10:a4a4a4a4 cdw11:a4a40000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.404 [2024-04-25 20:54:59.864625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.404 #35 NEW cov: 11907 ft: 14279 corp: 8/114b lim: 35 exec/s: 0 rss: 69Mb L: 14/31 MS: 1 CopyPart- 00:07:44.404 [2024-04-25 20:54:59.914465] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:a4a40aa4 cdw11:a4a40001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.404 [2024-04-25 20:54:59.914489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.404 #41 NEW cov: 11907 ft: 14322 corp: 9/123b lim: 35 exec/s: 0 rss: 69Mb L: 9/31 MS: 1 InsertByte- 00:07:44.404 [2024-04-25 20:54:59.954552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:a4a40aa4 cdw11:a4a40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.404 [2024-04-25 20:54:59.954577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.404 #42 NEW cov: 11907 ft: 14359 corp: 10/132b lim: 35 exec/s: 0 rss: 69Mb L: 9/31 MS: 1 InsertByte- 00:07:44.404 [2024-04-25 20:54:59.995357] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:a4a40aa4 cdw11:a4a40001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.404 [2024-04-25 20:54:59.995382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.404 [2024-04-25 20:54:59.995443] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00002b00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.404 [2024-04-25 20:54:59.995458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.404 [2024-04-25 20:54:59.995515] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.404 [2024-04-25 20:54:59.995529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.404 [2024-04-25 20:54:59.995585] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.404 [2024-04-25 20:54:59.995598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.404 [2024-04-25 20:54:59.995653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.404 [2024-04-25 20:54:59.995667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:44.404 #43 NEW cov: 11907 ft: 14434 corp: 11/167b lim: 35 exec/s: 0 rss: 69Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:07:44.404 [2024-04-25 20:55:00.035028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0f0f0a0f cdw11:0f0f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.404 [2024-04-25 20:55:00.035054] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.404 [2024-04-25 20:55:00.035114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0f0f0f0f cdw11:0f0f0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.404 [2024-04-25 20:55:00.035128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.404 #44 NEW cov: 11907 ft: 14479 corp: 12/187b lim: 35 exec/s: 0 rss: 69Mb L: 20/35 MS: 1 InsertRepeatedBytes- 00:07:44.665 [2024-04-25 20:55:00.074932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:a4a40aa4 cdw11:a4a50000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.665 [2024-04-25 20:55:00.074959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.665 #45 NEW cov: 11907 ft: 14511 corp: 13/194b lim: 35 exec/s: 0 rss: 69Mb L: 7/35 MS: 1 EraseBytes- 00:07:44.665 [2024-04-25 20:55:00.115265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a0a40a4 cdw11:a4a40001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.665 [2024-04-25 20:55:00.115292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.665 [2024-04-25 20:55:00.115349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:a4a4a4a4 cdw11:a4a40001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.665 [2024-04-25 20:55:00.115363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.665 #46 NEW cov: 11907 ft: 14536 corp: 14/209b lim: 35 exec/s: 0 rss: 69Mb L: 15/35 MS: 1 InsertByte- 00:07:44.665 [2024-04-25 20:55:00.165250] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:a4a40aa4 cdw11:a4a40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.665 [2024-04-25 20:55:00.165276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.665 #47 NEW cov: 11907 ft: 14656 corp: 15/218b lim: 35 exec/s: 0 rss: 69Mb L: 9/35 MS: 1 CopyPart- 00:07:44.665 [2024-04-25 20:55:00.205896] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:a4a40aa4 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.665 [2024-04-25 20:55:00.205928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.665 [2024-04-25 20:55:00.205987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.665 [2024-04-25 20:55:00.206007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.665 [2024-04-25 20:55:00.206064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.665 [2024-04-25 20:55:00.206079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.665 [2024-04-25 
20:55:00.206136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.665 [2024-04-25 20:55:00.206149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.665 NEW_FUNC[1/1]: 0x19e6b30 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:44.665 #50 NEW cov: 11930 ft: 14710 corp: 16/249b lim: 35 exec/s: 0 rss: 69Mb L: 31/35 MS: 3 EraseBytes-ChangeBit-InsertRepeatedBytes- 00:07:44.665 [2024-04-25 20:55:00.246149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:a4a40aa4 cdw11:a4a40001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.665 [2024-04-25 20:55:00.246175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.665 [2024-04-25 20:55:00.246243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00002b00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.665 [2024-04-25 20:55:00.246257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.665 [2024-04-25 20:55:00.246313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.665 [2024-04-25 20:55:00.246327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.665 [2024-04-25 20:55:00.246382] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:60000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.665 [2024-04-25 20:55:00.246396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.665 [2024-04-25 20:55:00.246451] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.665 [2024-04-25 20:55:00.246465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:44.665 #51 NEW cov: 11930 ft: 14734 corp: 17/284b lim: 35 exec/s: 0 rss: 69Mb L: 35/35 MS: 1 ChangeByte- 00:07:44.665 [2024-04-25 20:55:00.296131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:a4a50aa4 cdw11:2b0a0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.665 [2024-04-25 20:55:00.296156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.665 [2024-04-25 20:55:00.296216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:a4a4a4a4 cdw11:a5530002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.665 [2024-04-25 20:55:00.296230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.665 [2024-04-25 20:55:00.296290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:53535353 cdw11:53530002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.665 [2024-04-25 
20:55:00.296304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.665 [2024-04-25 20:55:00.296360] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:53535353 cdw11:53530000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.665 [2024-04-25 20:55:00.296374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.665 #52 NEW cov: 11930 ft: 14759 corp: 18/317b lim: 35 exec/s: 52 rss: 69Mb L: 33/35 MS: 1 CrossOver- 00:07:44.925 [2024-04-25 20:55:00.345949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0f0f0a1f cdw11:0f0f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.925 [2024-04-25 20:55:00.345975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.925 [2024-04-25 20:55:00.346036] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0f0f0f0f cdw11:0f0f0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.925 [2024-04-25 20:55:00.346051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.925 #53 NEW cov: 11930 ft: 14812 corp: 19/337b lim: 35 exec/s: 53 rss: 70Mb L: 20/35 MS: 1 ChangeBit- 00:07:44.925 [2024-04-25 20:55:00.396434] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:a40a0aa4 cdw11:a4a40001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.925 [2024-04-25 20:55:00.396460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.925 [2024-04-25 20:55:00.396518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:a52ba4a4 cdw11:a4ff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.925 [2024-04-25 20:55:00.396533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.925 [2024-04-25 20:55:00.396588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:a4a4a4a4 cdw11:a4a50000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.925 [2024-04-25 20:55:00.396602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.925 [2024-04-25 20:55:00.396657] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:a4a4a4a4 cdw11:a4a40000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.925 [2024-04-25 20:55:00.396670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.925 #54 NEW cov: 11930 ft: 14866 corp: 20/365b lim: 35 exec/s: 54 rss: 70Mb L: 28/35 MS: 1 ChangeByte- 00:07:44.925 [2024-04-25 20:55:00.446546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0aa4a40a cdw11:a4a40001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.925 [2024-04-25 20:55:00.446573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.925 [2024-04-25 20:55:00.446630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) 
qid:0 cid:5 nsid:0 cdw10:a4a4a4a4 cdw11:a4a40002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.925 [2024-04-25 20:55:00.446645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.925 [2024-04-25 20:55:00.446717] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:5b5b5b5b cdw11:5b5b0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.925 [2024-04-25 20:55:00.446732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.925 [2024-04-25 20:55:00.446788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:5b5b5b5b cdw11:5b5b0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.925 [2024-04-25 20:55:00.446805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.926 #55 NEW cov: 11930 ft: 14937 corp: 21/396b lim: 35 exec/s: 55 rss: 70Mb L: 31/35 MS: 1 InsertRepeatedBytes- 00:07:44.926 [2024-04-25 20:55:00.486461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0f0f0a0f cdw11:0f0f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.926 [2024-04-25 20:55:00.486487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.926 [2024-04-25 20:55:00.486547] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:640f0f0f cdw11:0f0f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.926 [2024-04-25 20:55:00.486562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.926 [2024-04-25 20:55:00.486621] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:a4a4a4a4 cdw11:a4a50000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.926 [2024-04-25 20:55:00.486636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.926 #56 NEW cov: 11930 ft: 15149 corp: 22/417b lim: 35 exec/s: 56 rss: 70Mb L: 21/35 MS: 1 InsertByte- 00:07:44.926 [2024-04-25 20:55:00.526297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0f0f0a1f cdw11:0f0f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.926 [2024-04-25 20:55:00.526323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.926 #57 NEW cov: 11930 ft: 15264 corp: 23/430b lim: 35 exec/s: 57 rss: 70Mb L: 13/35 MS: 1 EraseBytes- 00:07:44.926 [2024-04-25 20:55:00.566805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0f0f0a0f cdw11:0f0f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.926 [2024-04-25 20:55:00.566831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.926 [2024-04-25 20:55:00.566888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:640f0f0f cdw11:0f0f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.926 [2024-04-25 20:55:00.566913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
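The MS: tag on each NEW line names the mutator chain that produced the unit (CopyPart, ChangeBit, CrossOver, PersAutoDict, and so on), which is handy for judging what is actually earning coverage against this target. A quick tally over a saved log, again assuming a hypothetical fuzzer_run.log:

# Count how many NEW units each libFuzzer mutator contributed:
grep -oE 'MS: [0-9]+ [A-Za-z-]+' fuzzer_run.log \
    | awk '{ print $3 }' | tr '-' '\n' | grep -v '^$' \
    | sort | uniq -c | sort -rn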
00:07:44.926 [2024-04-25 20:55:00.566973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:a4a4a4a4 cdw11:a4a50000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.926 [2024-04-25 20:55:00.566986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.186 #58 NEW cov: 11930 ft: 15271 corp: 24/451b lim: 35 exec/s: 58 rss: 70Mb L: 21/35 MS: 1 ShuffleBytes- 00:07:45.186 [2024-04-25 20:55:00.617061] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:a4a40aa4 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.186 [2024-04-25 20:55:00.617087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.186 [2024-04-25 20:55:00.617145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.186 [2024-04-25 20:55:00.617159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.186 [2024-04-25 20:55:00.617215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:01000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.186 [2024-04-25 20:55:00.617229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.186 [2024-04-25 20:55:00.617287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.186 [2024-04-25 20:55:00.617300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.186 #59 NEW cov: 11930 ft: 15288 corp: 25/482b lim: 35 exec/s: 59 rss: 70Mb L: 31/35 MS: 1 CMP- DE: "\000\001"- 00:07:45.186 [2024-04-25 20:55:00.667144] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:a4a40aa4 cdw11:a4a40001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.186 [2024-04-25 20:55:00.667170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.186 [2024-04-25 20:55:00.667229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00002b00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.186 [2024-04-25 20:55:00.667244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.186 [2024-04-25 20:55:00.667304] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.186 [2024-04-25 20:55:00.667319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.186 [2024-04-25 20:55:00.667376] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:60000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.186 [2024-04-25 20:55:00.667390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 
m:0 dnr:0 00:07:45.186 #60 NEW cov: 11930 ft: 15299 corp: 26/516b lim: 35 exec/s: 60 rss: 70Mb L: 34/35 MS: 1 EraseBytes- 00:07:45.186 [2024-04-25 20:55:00.716998] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.186 [2024-04-25 20:55:00.717023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.186 [2024-04-25 20:55:00.717083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffa4ffff cdw11:a4a40001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.186 [2024-04-25 20:55:00.717097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.186 #61 NEW cov: 11930 ft: 15328 corp: 27/534b lim: 35 exec/s: 61 rss: 70Mb L: 18/35 MS: 1 InsertRepeatedBytes- 00:07:45.186 [2024-04-25 20:55:00.767160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:a4360aa4 cdw11:36360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.186 [2024-04-25 20:55:00.767187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.186 [2024-04-25 20:55:00.767247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:36363636 cdw11:36a40001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.186 [2024-04-25 20:55:00.767262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.186 #67 NEW cov: 11930 ft: 15345 corp: 28/552b lim: 35 exec/s: 67 rss: 70Mb L: 18/35 MS: 1 InsertRepeatedBytes- 00:07:45.186 [2024-04-25 20:55:00.817462] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0f0f0001 cdw11:0f0f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.186 [2024-04-25 20:55:00.817487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.186 [2024-04-25 20:55:00.817546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:640f0f0f cdw11:0f0f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.186 [2024-04-25 20:55:00.817563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.186 [2024-04-25 20:55:00.817622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:a4a4a4a4 cdw11:a4a50000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.186 [2024-04-25 20:55:00.817636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.186 #68 NEW cov: 11930 ft: 15355 corp: 29/573b lim: 35 exec/s: 68 rss: 70Mb L: 21/35 MS: 1 PersAutoDict- DE: "\000\001"- 00:07:45.447 [2024-04-25 20:55:00.857728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00a40a00 cdw11:a4a40000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.447 [2024-04-25 20:55:00.857753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.447 [2024-04-25 20:55:00.857811] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.447 [2024-04-25 20:55:00.857824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.447 [2024-04-25 20:55:00.857882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.447 [2024-04-25 20:55:00.857896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.447 [2024-04-25 20:55:00.857951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.447 [2024-04-25 20:55:00.857965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.447 #69 NEW cov: 11930 ft: 15373 corp: 30/606b lim: 35 exec/s: 69 rss: 70Mb L: 33/35 MS: 1 CopyPart- 00:07:45.447 [2024-04-25 20:55:00.898011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:a4a40aa4 cdw11:a4a40001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.447 [2024-04-25 20:55:00.898037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.447 [2024-04-25 20:55:00.898097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00002b00 cdw11:01000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.447 [2024-04-25 20:55:00.898111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.447 [2024-04-25 20:55:00.898168] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.447 [2024-04-25 20:55:00.898182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.447 [2024-04-25 20:55:00.898238] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.447 [2024-04-25 20:55:00.898251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.447 [2024-04-25 20:55:00.898306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.447 [2024-04-25 20:55:00.898321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:45.447 #70 NEW cov: 11930 ft: 15397 corp: 31/641b lim: 35 exec/s: 70 rss: 70Mb L: 35/35 MS: 1 PersAutoDict- DE: "\000\001"- 00:07:45.447 [2024-04-25 20:55:00.937472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0f0f0a0f cdw11:0fa40001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.447 [2024-04-25 20:55:00.937498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.447 #71 NEW cov: 11930 ft: 15404 corp: 32/653b lim: 35 exec/s: 71 
rss: 70Mb L: 12/35 MS: 1 EraseBytes- 00:07:45.447 [2024-04-25 20:55:00.978082] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:a40a0aa4 cdw11:a4a40001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.447 [2024-04-25 20:55:00.978107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.447 [2024-04-25 20:55:00.978166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:a52ba4a4 cdw11:a4fe0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.447 [2024-04-25 20:55:00.978180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.447 [2024-04-25 20:55:00.978236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:a4a4a4a4 cdw11:a4a50000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.447 [2024-04-25 20:55:00.978250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.447 [2024-04-25 20:55:00.978306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:a4a4a4a4 cdw11:a4a40000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.447 [2024-04-25 20:55:00.978319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.447 #72 NEW cov: 11930 ft: 15425 corp: 33/681b lim: 35 exec/s: 72 rss: 70Mb L: 28/35 MS: 1 ChangeBit- 00:07:45.447 [2024-04-25 20:55:01.028385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:a4a40aa4 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.447 [2024-04-25 20:55:01.028410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.447 [2024-04-25 20:55:01.028467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:4b4b0000 cdw11:4b4b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.447 [2024-04-25 20:55:01.028481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.447 [2024-04-25 20:55:01.028538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.447 [2024-04-25 20:55:01.028552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.447 [2024-04-25 20:55:01.028605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.447 [2024-04-25 20:55:01.028619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.447 [2024-04-25 20:55:01.028677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00b50000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.447 [2024-04-25 20:55:01.028690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:45.447 #73 NEW cov: 11930 ft: 15430 corp: 34/716b lim: 35 exec/s: 73 
rss: 70Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:07:45.447 [2024-04-25 20:55:01.067990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a0a40a4 cdw11:a4a40001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.447 [2024-04-25 20:55:01.068019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.447 [2024-04-25 20:55:01.068075] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:a400a4a4 cdw11:01a40001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.447 [2024-04-25 20:55:01.068092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.447 #74 NEW cov: 11930 ft: 15458 corp: 35/731b lim: 35 exec/s: 74 rss: 70Mb L: 15/35 MS: 1 PersAutoDict- DE: "\000\001"- 00:07:45.447 [2024-04-25 20:55:01.108322] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0f0f0001 cdw11:0f0f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.447 [2024-04-25 20:55:01.108347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.447 [2024-04-25 20:55:01.108407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:640f0f0f cdw11:0f0f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.447 [2024-04-25 20:55:01.108421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.447 [2024-04-25 20:55:01.108479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:a4a4a4a4 cdw11:a4a50000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.447 [2024-04-25 20:55:01.108493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.708 #75 NEW cov: 11930 ft: 15470 corp: 36/752b lim: 35 exec/s: 75 rss: 70Mb L: 21/35 MS: 1 ShuffleBytes- 00:07:45.708 [2024-04-25 20:55:01.158422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0f0f0001 cdw11:0f0f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.708 [2024-04-25 20:55:01.158447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.708 [2024-04-25 20:55:01.158506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:64000f0f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.708 [2024-04-25 20:55:01.158520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.708 [2024-04-25 20:55:01.158576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000100 cdw11:a4a50000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.708 [2024-04-25 20:55:01.158590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.708 #76 NEW cov: 11930 ft: 15492 corp: 37/773b lim: 35 exec/s: 76 rss: 70Mb L: 21/35 MS: 1 CMP- DE: "\000\000\000\000\001\000\000\000"- 00:07:45.708 [2024-04-25 20:55:01.198389] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0f0f0a0f 
cdw11:0f0f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.708 [2024-04-25 20:55:01.198415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.708 [2024-04-25 20:55:01.198474] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0f0f0f0f cdw11:0f0f0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.708 [2024-04-25 20:55:01.198488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.708 #77 NEW cov: 11930 ft: 15513 corp: 38/793b lim: 35 exec/s: 77 rss: 70Mb L: 20/35 MS: 1 ChangeByte- 00:07:45.708 [2024-04-25 20:55:01.238455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:a4a40aa4 cdw11:a4a40001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.708 [2024-04-25 20:55:01.238481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.708 [2024-04-25 20:55:01.238540] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.708 [2024-04-25 20:55:01.238555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.708 #78 NEW cov: 11930 ft: 15528 corp: 39/812b lim: 35 exec/s: 78 rss: 70Mb L: 19/35 MS: 1 EraseBytes- 00:07:45.708 [2024-04-25 20:55:01.278887] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0f0f0a1f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.708 [2024-04-25 20:55:01.278911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.708 [2024-04-25 20:55:01.278970] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.708 [2024-04-25 20:55:01.278984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.708 [2024-04-25 20:55:01.279047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.708 [2024-04-25 20:55:01.279061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.708 [2024-04-25 20:55:01.279115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:0f0f000f cdw11:0f0f0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.708 [2024-04-25 20:55:01.279129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.708 #79 NEW cov: 11930 ft: 15532 corp: 40/843b lim: 35 exec/s: 79 rss: 71Mb L: 31/35 MS: 1 InsertRepeatedBytes- 00:07:45.708 [2024-04-25 20:55:01.329292] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:a40a0aa4 cdw11:a4a40001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.708 [2024-04-25 20:55:01.329318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.708 [2024-04-25 
20:55:01.329374] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:a52ba4a4 cdw11:a4a40000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.708 [2024-04-25 20:55:01.329388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.708 [2024-04-25 20:55:01.329445] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:a4a4a4a4 cdw11:a4a50000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.708 [2024-04-25 20:55:01.329458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.708 [2024-04-25 20:55:01.329514] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:a4a4a4a4 cdw11:a4a40000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.708 [2024-04-25 20:55:01.329527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.708 [2024-04-25 20:55:01.329585] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:16161616 cdw11:16160000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.708 [2024-04-25 20:55:01.329599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:45.708 #80 NEW cov: 11930 ft: 15533 corp: 41/878b lim: 35 exec/s: 40 rss: 71Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:07:45.708 #80 DONE cov: 11930 ft: 15533 corp: 41/878b lim: 35 exec/s: 40 rss: 71Mb 00:07:45.708 ###### Recommended dictionary. ###### 00:07:45.708 "\000\001" # Uses: 3 00:07:45.708 "\000\000\000\000\001\000\000\000" # Uses: 0 00:07:45.708 ###### End of recommended dictionary. 
###### 00:07:45.708 Done 80 runs in 2 second(s) 00:07:45.968 20:55:01 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_4.conf /var/tmp/suppress_nvmf_fuzz 00:07:45.968 20:55:01 -- ../common.sh@72 -- # (( i++ )) 00:07:45.968 20:55:01 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:45.968 20:55:01 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:07:45.968 20:55:01 -- nvmf/run.sh@23 -- # local fuzzer_type=5 00:07:45.968 20:55:01 -- nvmf/run.sh@24 -- # local timen=1 00:07:45.968 20:55:01 -- nvmf/run.sh@25 -- # local core=0x1 00:07:45.968 20:55:01 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:45.968 20:55:01 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_5.conf 00:07:45.968 20:55:01 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:45.968 20:55:01 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:45.968 20:55:01 -- nvmf/run.sh@34 -- # printf %02d 5 00:07:45.968 20:55:01 -- nvmf/run.sh@34 -- # port=4405 00:07:45.968 20:55:01 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:45.968 20:55:01 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' 00:07:45.968 20:55:01 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4405"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:45.968 20:55:01 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:45.968 20:55:01 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:45.968 20:55:01 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' -c /tmp/fuzz_json_5.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 -Z 5 00:07:45.968 [2024-04-25 20:55:01.493898] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:07:45.968 [2024-04-25 20:55:01.493967] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid194453 ] 00:07:45.968 EAL: No free 2048 kB hugepages reported on node 1 00:07:46.229 [2024-04-25 20:55:01.643098] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:46.229 [2024-04-25 20:55:01.679013] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:46.229 [2024-04-25 20:55:01.698161] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:46.229 [2024-04-25 20:55:01.750257] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:46.229 [2024-04-25 20:55:01.766595] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4405 *** 00:07:46.229 INFO: Running with entropic power schedule (0xFF, 100). 
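The nvmf/run.sh xtrace above shows the recurring per-target setup: derive a TCP port from the fuzzer index, rewrite the stock JSON config to listen on it, register LeakSanitizer suppressions, and launch llvm_nvme_fuzz against the resulting transport ID. A condensed sketch of that sequence follows; bash xtrace does not print redirections, so the ">" targets, the SPDK_DIR variable, and the exact port arithmetic are inferred rather than copied from the log.

    #!/usr/bin/env bash
    # Sketch of one iteration of the fuzz loop traced above (fuzzer_type=5).
    SPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    i=5
    port="44$(printf '%02d' "$i")"     # -> 4405; one port per fuzzer index
    conf="/tmp/fuzz_json_${i}.conf"
    corpus="${SPDK_DIR}/../corpus/llvm_nvmf_${i}"
    trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:${port}"

    mkdir -p "$corpus"
    # Retarget the stock config from the default port 4420 to this run's port.
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"${port}\"/" \
        "${SPDK_DIR}/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$conf"
    # Suppress two known-benign leaks so LSAN does not fail the run.
    printf 'leak:spdk_nvmf_qpair_disconnect\nleak:nvmf_ctrlr_create\n' > /var/tmp/suppress_nvmf_fuzz
    export LSAN_OPTIONS="report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0"

    "${SPDK_DIR}/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" \
        -m 0x1 -s 512 -P "${SPDK_DIR}/../output/llvm/" \
        -F "$trid" -c "$conf" -t 1 -D "$corpus" -Z "$i"

Deriving the port from the fuzzer index (4405 for target 5 here, 4406 for target 6 below) keeps concurrently running targets from colliding on the default NVMe/TCP port 4420.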
00:07:46.229 INFO: Seed: 1986964342 00:07:46.229 INFO: Loaded 1 modules (348579 inline 8-bit counters): 348579 [0x2764bcc, 0x27b9d6f), 00:07:46.229 INFO: Loaded 1 PC tables (348579 PCs): 348579 [0x27b9d70,0x2d0b7a0), 00:07:46.229 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:46.229 INFO: A corpus is not provided, starting from an empty corpus 00:07:46.229 #2 INITED exec/s: 0 rss: 60Mb 00:07:46.229 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:46.229 This may also happen if the target rejected all inputs we tried so far 00:07:46.229 [2024-04-25 20:55:01.815444] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:30303030 cdw11:30300001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.229 [2024-04-25 20:55:01.815472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.489 NEW_FUNC[1/670]: 0x4abf60 in fuzz_admin_create_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:142 00:07:46.489 NEW_FUNC[2/670]: 0x4e0360 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:46.489 #13 NEW cov: 11688 ft: 11689 corp: 2/16b lim: 45 exec/s: 0 rss: 68Mb L: 15/15 MS: 1 InsertRepeatedBytes- 00:07:46.489 [2024-04-25 20:55:02.116266] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:30303030 cdw11:30300001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.489 [2024-04-25 20:55:02.116304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.489 [2024-04-25 20:55:02.116369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:abab3030 cdw11:abab0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.489 [2024-04-25 20:55:02.116387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.489 NEW_FUNC[1/1]: 0x176aff0 in nvme_get_transport /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_transport.c:56 00:07:46.489 #14 NEW cov: 11827 ft: 12940 corp: 3/42b lim: 45 exec/s: 0 rss: 68Mb L: 26/26 MS: 1 InsertRepeatedBytes- 00:07:46.748 [2024-04-25 20:55:02.166138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:30303030 cdw11:30300000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.748 [2024-04-25 20:55:02.166164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.748 #15 NEW cov: 11833 ft: 13213 corp: 4/57b lim: 45 exec/s: 0 rss: 68Mb L: 15/26 MS: 1 CrossOver- 00:07:46.748 [2024-04-25 20:55:02.206223] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:30303030 cdw11:30300001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.748 [2024-04-25 20:55:02.206250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.748 #16 NEW cov: 11918 ft: 13424 corp: 5/72b lim: 45 exec/s: 0 rss: 68Mb L: 15/26 MS: 1 ChangeASCIIInt- 00:07:46.748 [2024-04-25 20:55:02.246527] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 
cid:4 nsid:0 cdw10:30303030 cdw11:30300001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.748 [2024-04-25 20:55:02.246552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.748 [2024-04-25 20:55:02.246605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:abab3030 cdw11:abab0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.748 [2024-04-25 20:55:02.246618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.748 #17 NEW cov: 11918 ft: 13494 corp: 6/98b lim: 45 exec/s: 0 rss: 68Mb L: 26/26 MS: 1 ChangeASCIIInt- 00:07:46.748 [2024-04-25 20:55:02.286477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:30303030 cdw11:30300001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.748 [2024-04-25 20:55:02.286502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.748 #18 NEW cov: 11918 ft: 13525 corp: 7/108b lim: 45 exec/s: 0 rss: 68Mb L: 10/26 MS: 1 EraseBytes- 00:07:46.748 [2024-04-25 20:55:02.316559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:30303030 cdw11:30300001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.748 [2024-04-25 20:55:02.316584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.748 #19 NEW cov: 11918 ft: 13574 corp: 8/123b lim: 45 exec/s: 0 rss: 68Mb L: 15/26 MS: 1 ChangeASCIIInt- 00:07:46.748 [2024-04-25 20:55:02.346652] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:000a0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.748 [2024-04-25 20:55:02.346677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.748 #23 NEW cov: 11918 ft: 13631 corp: 9/132b lim: 45 exec/s: 0 rss: 69Mb L: 9/26 MS: 4 CrossOver-CrossOver-CrossOver-InsertRepeatedBytes- 00:07:46.749 [2024-04-25 20:55:02.386760] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:30303030 cdw11:30300001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.749 [2024-04-25 20:55:02.386785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.749 #24 NEW cov: 11918 ft: 13759 corp: 10/147b lim: 45 exec/s: 0 rss: 69Mb L: 15/26 MS: 1 CopyPart- 00:07:47.009 [2024-04-25 20:55:02.416858] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:30303030 cdw11:30300001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.009 [2024-04-25 20:55:02.416883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.009 #25 NEW cov: 11918 ft: 13833 corp: 11/157b lim: 45 exec/s: 0 rss: 69Mb L: 10/26 MS: 1 CopyPart- 00:07:47.009 [2024-04-25 20:55:02.457161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:30303030 cdw11:30300001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.009 [2024-04-25 20:55:02.457186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.009 
[2024-04-25 20:55:02.457236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:30303030 cdw11:30300001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.009 [2024-04-25 20:55:02.457248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.009 #26 NEW cov: 11918 ft: 13909 corp: 12/178b lim: 45 exec/s: 0 rss: 69Mb L: 21/26 MS: 1 CrossOver- 00:07:47.009 [2024-04-25 20:55:02.497234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:30307630 cdw11:30300001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.009 [2024-04-25 20:55:02.497258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.009 [2024-04-25 20:55:02.497310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:abab3030 cdw11:abab0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.009 [2024-04-25 20:55:02.497323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.009 #27 NEW cov: 11918 ft: 13937 corp: 13/204b lim: 45 exec/s: 0 rss: 69Mb L: 26/26 MS: 1 ChangeByte- 00:07:47.009 [2024-04-25 20:55:02.537185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:30303030 cdw11:30300001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.009 [2024-04-25 20:55:02.537210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.009 #28 NEW cov: 11918 ft: 13980 corp: 14/219b lim: 45 exec/s: 0 rss: 69Mb L: 15/26 MS: 1 ChangeBinInt- 00:07:47.009 [2024-04-25 20:55:02.577292] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:9f9f9f9f cdw11:9f9f0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.009 [2024-04-25 20:55:02.577317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.009 #30 NEW cov: 11918 ft: 13984 corp: 15/234b lim: 45 exec/s: 0 rss: 69Mb L: 15/26 MS: 2 ChangeBinInt-InsertRepeatedBytes- 00:07:47.009 [2024-04-25 20:55:02.617750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:9f9f9f9f cdw11:9f9f0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.009 [2024-04-25 20:55:02.617774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.009 [2024-04-25 20:55:02.617826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:9f309f9f cdw11:9f300001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.009 [2024-04-25 20:55:02.617839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.009 [2024-04-25 20:55:02.617889] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:30303030 cdw11:30300001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.009 [2024-04-25 20:55:02.617903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.009 #31 NEW cov: 11918 ft: 14235 corp: 16/263b lim: 45 exec/s: 0 rss: 69Mb L: 29/29 MS: 1 CrossOver- 00:07:47.009 [2024-04-25 
20:55:02.657534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:30303030 cdw11:30300001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.009 [2024-04-25 20:55:02.657559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.269 #32 NEW cov: 11918 ft: 14253 corp: 17/278b lim: 45 exec/s: 0 rss: 69Mb L: 15/29 MS: 1 ChangeASCIIInt- 00:07:47.269 [2024-04-25 20:55:02.697711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:30373030 cdw11:33370001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.269 [2024-04-25 20:55:02.697737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.269 NEW_FUNC[1/1]: 0x19e6b30 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:47.269 #33 NEW cov: 11941 ft: 14290 corp: 18/293b lim: 45 exec/s: 0 rss: 69Mb L: 15/29 MS: 1 ChangeASCIIInt- 00:07:47.269 [2024-04-25 20:55:02.737772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:30303030 cdw11:30300001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.269 [2024-04-25 20:55:02.737797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.269 #34 NEW cov: 11941 ft: 14304 corp: 19/308b lim: 45 exec/s: 0 rss: 69Mb L: 15/29 MS: 1 CopyPart- 00:07:47.269 [2024-04-25 20:55:02.778211] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:30303030 cdw11:30300001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.269 [2024-04-25 20:55:02.778236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.269 [2024-04-25 20:55:02.778290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:30303030 cdw11:0a300001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.269 [2024-04-25 20:55:02.778303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.269 [2024-04-25 20:55:02.778353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:35353039 cdw11:31360001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.269 [2024-04-25 20:55:02.778367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.269 #35 NEW cov: 11941 ft: 14329 corp: 20/335b lim: 45 exec/s: 35 rss: 70Mb L: 27/29 MS: 1 CrossOver- 00:07:47.269 [2024-04-25 20:55:02.818031] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:30302130 cdw11:30300001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.269 [2024-04-25 20:55:02.818056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.269 #36 NEW cov: 11941 ft: 14348 corp: 21/345b lim: 45 exec/s: 36 rss: 70Mb L: 10/29 MS: 1 ChangeByte- 00:07:47.269 [2024-04-25 20:55:02.858124] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:30303030 cdw11:30270001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.269 [2024-04-25 20:55:02.858149] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.269 #37 NEW cov: 11941 ft: 14377 corp: 22/355b lim: 45 exec/s: 37 rss: 70Mb L: 10/29 MS: 1 ChangeByte- 00:07:47.269 [2024-04-25 20:55:02.888491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:30303030 cdw11:30300001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.269 [2024-04-25 20:55:02.888516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.269 [2024-04-25 20:55:02.888565] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:abab3030 cdw11:abab0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.269 [2024-04-25 20:55:02.888578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.269 [2024-04-25 20:55:02.888632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:abababab cdw11:ab300001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.269 [2024-04-25 20:55:02.888646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.269 #38 NEW cov: 11941 ft: 14410 corp: 23/382b lim: 45 exec/s: 38 rss: 70Mb L: 27/29 MS: 1 InsertByte- 00:07:47.269 [2024-04-25 20:55:02.928339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:30303030 cdw11:30300001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.269 [2024-04-25 20:55:02.928363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.530 #39 NEW cov: 11941 ft: 14424 corp: 24/398b lim: 45 exec/s: 39 rss: 70Mb L: 16/29 MS: 1 InsertByte- 00:07:47.530 [2024-04-25 20:55:02.969065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:30303030 cdw11:30300000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.530 [2024-04-25 20:55:02.969089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.530 [2024-04-25 20:55:02.969141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.530 [2024-04-25 20:55:02.969154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.530 [2024-04-25 20:55:02.969206] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.530 [2024-04-25 20:55:02.969218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.530 [2024-04-25 20:55:02.969269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.530 [2024-04-25 20:55:02.969281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.530 [2024-04-25 20:55:02.969333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:30302e30 cdw11:30300001 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:07:47.530 [2024-04-25 20:55:02.969347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:47.530 #40 NEW cov: 11941 ft: 14796 corp: 25/443b lim: 45 exec/s: 40 rss: 70Mb L: 45/45 MS: 1 InsertRepeatedBytes- 00:07:47.530 [2024-04-25 20:55:03.019207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:51513030 cdw11:51510002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.530 [2024-04-25 20:55:03.019231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.530 [2024-04-25 20:55:03.019284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:51515151 cdw11:51510002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.530 [2024-04-25 20:55:03.019297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.530 [2024-04-25 20:55:03.019349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:51515151 cdw11:51510002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.530 [2024-04-25 20:55:03.019362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.530 [2024-04-25 20:55:03.019413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:51515151 cdw11:51300001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.530 [2024-04-25 20:55:03.019425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.530 [2024-04-25 20:55:03.019477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:30300a30 cdw11:30300001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.530 [2024-04-25 20:55:03.019491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:47.530 #41 NEW cov: 11941 ft: 14883 corp: 26/488b lim: 45 exec/s: 41 rss: 70Mb L: 45/45 MS: 1 InsertRepeatedBytes- 00:07:47.530 [2024-04-25 20:55:03.058696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:30303030 cdw11:30300001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.530 [2024-04-25 20:55:03.058721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.530 #42 NEW cov: 11941 ft: 14895 corp: 27/497b lim: 45 exec/s: 42 rss: 70Mb L: 9/45 MS: 1 EraseBytes- 00:07:47.530 [2024-04-25 20:55:03.089089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:30603030 cdw11:30300001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.530 [2024-04-25 20:55:03.089113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.530 [2024-04-25 20:55:03.089165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:abab3030 cdw11:abab0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.530 [2024-04-25 20:55:03.089179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.530 [2024-04-25 20:55:03.089231] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:abababab cdw11:ab300001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.530 [2024-04-25 20:55:03.089245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.530 #43 NEW cov: 11941 ft: 14917 corp: 28/524b lim: 45 exec/s: 43 rss: 70Mb L: 27/45 MS: 1 ChangeByte- 00:07:47.530 [2024-04-25 20:55:03.128935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:9f9f9f9f cdw11:9f9f0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.530 [2024-04-25 20:55:03.128959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.530 #44 NEW cov: 11941 ft: 14923 corp: 29/539b lim: 45 exec/s: 44 rss: 70Mb L: 15/45 MS: 1 CrossOver- 00:07:47.530 [2024-04-25 20:55:03.169196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:30303030 cdw11:30300001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.530 [2024-04-25 20:55:03.169221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.530 [2024-04-25 20:55:03.169272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:30303030 cdw11:30300001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.530 [2024-04-25 20:55:03.169286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.790 #45 NEW cov: 11941 ft: 14939 corp: 30/560b lim: 45 exec/s: 45 rss: 70Mb L: 21/45 MS: 1 ShuffleBytes- 00:07:47.790 [2024-04-25 20:55:03.209262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:30303030 cdw11:30300001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.790 [2024-04-25 20:55:03.209286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.790 [2024-04-25 20:55:03.209339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:30303030 cdw11:47300000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.790 [2024-04-25 20:55:03.209352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.790 #46 NEW cov: 11941 ft: 14952 corp: 31/583b lim: 45 exec/s: 46 rss: 70Mb L: 23/45 MS: 1 InsertRepeatedBytes- 00:07:47.790 [2024-04-25 20:55:03.249251] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:30303030 cdw11:30300001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.790 [2024-04-25 20:55:03.249275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.790 #47 NEW cov: 11941 ft: 14980 corp: 32/598b lim: 45 exec/s: 47 rss: 70Mb L: 15/45 MS: 1 ChangeASCIIInt- 00:07:47.790 [2024-04-25 20:55:03.289309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:1e9cf2dc cdw11:e47f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.790 [2024-04-25 20:55:03.289334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.790 #48 NEW cov: 11941 ft: 14991 corp: 33/613b lim: 
45 exec/s: 48 rss: 70Mb L: 15/45 MS: 1 CMP- DE: "\362\334\036\234\344\177\000\000"- 00:07:47.790 [2024-04-25 20:55:03.329464] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00300a00 cdw11:0a0a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.790 [2024-04-25 20:55:03.329489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.790 #49 NEW cov: 11941 ft: 15001 corp: 34/622b lim: 45 exec/s: 49 rss: 70Mb L: 9/45 MS: 1 ShuffleBytes- 00:07:47.790 [2024-04-25 20:55:03.369739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:30303030 cdw11:30300001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.790 [2024-04-25 20:55:03.369763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.790 [2024-04-25 20:55:03.369816] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:1e9cf2dc cdw11:e47f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.790 [2024-04-25 20:55:03.369829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.790 #50 NEW cov: 11941 ft: 15023 corp: 35/645b lim: 45 exec/s: 50 rss: 70Mb L: 23/45 MS: 1 PersAutoDict- DE: "\362\334\036\234\344\177\000\000"- 00:07:47.790 [2024-04-25 20:55:03.409857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:30303030 cdw11:30300001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.790 [2024-04-25 20:55:03.409881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.790 [2024-04-25 20:55:03.409933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:30303030 cdw11:30300001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.790 [2024-04-25 20:55:03.409947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.790 #51 NEW cov: 11941 ft: 15078 corp: 36/670b lim: 45 exec/s: 51 rss: 70Mb L: 25/45 MS: 1 CrossOver- 00:07:47.790 [2024-04-25 20:55:03.450202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:30603030 cdw11:30d90006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.790 [2024-04-25 20:55:03.450228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.790 [2024-04-25 20:55:03.450283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:abab3030 cdw11:abab0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.790 [2024-04-25 20:55:03.450297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.790 [2024-04-25 20:55:03.450347] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:abababab cdw11:ab300001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.790 [2024-04-25 20:55:03.450361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.049 #52 NEW cov: 11941 ft: 15082 corp: 37/697b lim: 45 exec/s: 52 rss: 70Mb L: 27/45 MS: 1 ChangeBinInt- 00:07:48.049 
[2024-04-25 20:55:03.490430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:a8a824a8 cdw11:a8a80005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.049 [2024-04-25 20:55:03.490456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.049 [2024-04-25 20:55:03.490508] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:a8a8a8a8 cdw11:a8a80005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.049 [2024-04-25 20:55:03.490521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.049 [2024-04-25 20:55:03.490572] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:a8a8a8a8 cdw11:a8a80005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.049 [2024-04-25 20:55:03.490586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.049 [2024-04-25 20:55:03.490636] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:a8a8a8a8 cdw11:a8a80005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.050 [2024-04-25 20:55:03.490649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.050 #56 NEW cov: 11941 ft: 15097 corp: 38/739b lim: 45 exec/s: 56 rss: 70Mb L: 42/45 MS: 4 ChangeByte-ChangeBit-ChangeBit-InsertRepeatedBytes- 00:07:48.050 [2024-04-25 20:55:03.530219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:30303030 cdw11:30300001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.050 [2024-04-25 20:55:03.530243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.050 [2024-04-25 20:55:03.530294] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:30303030 cdw11:30300001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.050 [2024-04-25 20:55:03.530308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.050 #57 NEW cov: 11941 ft: 15131 corp: 39/760b lim: 45 exec/s: 57 rss: 70Mb L: 21/45 MS: 1 ShuffleBytes- 00:07:48.050 [2024-04-25 20:55:03.570167] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000a01 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.050 [2024-04-25 20:55:03.570191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.050 #58 NEW cov: 11941 ft: 15138 corp: 40/769b lim: 45 exec/s: 58 rss: 70Mb L: 9/45 MS: 1 CMP- DE: "\001\000\000\000\000\000\004\000"- 00:07:48.050 [2024-04-25 20:55:03.610419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:30303030 cdw11:30300001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.050 [2024-04-25 20:55:03.610445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.050 [2024-04-25 20:55:03.610497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:30302030 cdw11:30300001 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:07:48.050 [2024-04-25 20:55:03.610511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.050 #59 NEW cov: 11941 ft: 15150 corp: 41/794b lim: 45 exec/s: 59 rss: 70Mb L: 25/45 MS: 1 ChangeBit- 00:07:48.050 [2024-04-25 20:55:03.650411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00003a00 cdw11:000a0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.050 [2024-04-25 20:55:03.650436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.050 #60 NEW cov: 11941 ft: 15158 corp: 42/803b lim: 45 exec/s: 60 rss: 70Mb L: 9/45 MS: 1 ChangeByte- 00:07:48.050 [2024-04-25 20:55:03.690556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00302900 cdw11:0a0a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.050 [2024-04-25 20:55:03.690581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.309 #61 NEW cov: 11941 ft: 15213 corp: 43/812b lim: 45 exec/s: 61 rss: 70Mb L: 9/45 MS: 1 ChangeByte- 00:07:48.309 [2024-04-25 20:55:03.730834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:30303030 cdw11:30300001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.309 [2024-04-25 20:55:03.730860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.309 [2024-04-25 20:55:03.730913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:30303030 cdw11:30300001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.309 [2024-04-25 20:55:03.730927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.309 #62 NEW cov: 11941 ft: 15238 corp: 44/837b lim: 45 exec/s: 62 rss: 70Mb L: 25/45 MS: 1 ChangeASCIIInt- 00:07:48.309 [2024-04-25 20:55:03.770862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:30d03030 cdw11:cfcf0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.309 [2024-04-25 20:55:03.770887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.309 [2024-04-25 20:55:03.770938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:3030cfc9 cdw11:30300001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.309 [2024-04-25 20:55:03.770951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.309 #63 NEW cov: 11941 ft: 15265 corp: 45/858b lim: 45 exec/s: 63 rss: 70Mb L: 21/45 MS: 1 ChangeBinInt- 00:07:48.309 [2024-04-25 20:55:03.811029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:30303030 cdw11:30300001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.309 [2024-04-25 20:55:03.811053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.309 [2024-04-25 20:55:03.811104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:30303030 cdw11:30300001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.309 
[2024-04-25 20:55:03.811117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.309 #64 pulse cov: 11941 ft: 15276 corp: 45/858b lim: 45 exec/s: 32 rss: 70Mb 00:07:48.309 #64 NEW cov: 11941 ft: 15276 corp: 46/879b lim: 45 exec/s: 32 rss: 70Mb L: 21/45 MS: 1 CopyPart- 00:07:48.309 #64 DONE cov: 11941 ft: 15276 corp: 46/879b lim: 45 exec/s: 32 rss: 70Mb 00:07:48.309 ###### Recommended dictionary. ###### 00:07:48.309 "\362\334\036\234\344\177\000\000" # Uses: 1 00:07:48.309 "\001\000\000\000\000\000\004\000" # Uses: 0 00:07:48.309 ###### End of recommended dictionary. ###### 00:07:48.309 Done 64 runs in 2 second(s) 00:07:48.309 20:55:03 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_5.conf /var/tmp/suppress_nvmf_fuzz 00:07:48.309 20:55:03 -- ../common.sh@72 -- # (( i++ )) 00:07:48.309 20:55:03 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:48.309 20:55:03 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:07:48.309 20:55:03 -- nvmf/run.sh@23 -- # local fuzzer_type=6 00:07:48.309 20:55:03 -- nvmf/run.sh@24 -- # local timen=1 00:07:48.309 20:55:03 -- nvmf/run.sh@25 -- # local core=0x1 00:07:48.309 20:55:03 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:48.309 20:55:03 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_6.conf 00:07:48.309 20:55:03 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:48.309 20:55:03 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:48.309 20:55:03 -- nvmf/run.sh@34 -- # printf %02d 6 00:07:48.309 20:55:03 -- nvmf/run.sh@34 -- # port=4406 00:07:48.309 20:55:03 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:48.309 20:55:03 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' 00:07:48.309 20:55:03 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4406"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:48.309 20:55:03 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:48.309 20:55:03 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:48.309 20:55:03 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' -c /tmp/fuzz_json_6.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 -Z 6 00:07:48.569 [2024-04-25 20:55:03.975619] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:07:48.569 [2024-04-25 20:55:03.975701] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid194981 ] 00:07:48.569 EAL: No free 2048 kB hugepages reported on node 1 00:07:48.569 [2024-04-25 20:55:04.114580] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 
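Each run ends with a "Recommended dictionary" block like the one above: byte sequences the mutator found productive, printed with C-style octal escapes and a "# Uses:" count apiece. This job does not feed them back in, but stock libFuzzer can consume such entries through a -dict= file in AFL-style name="value" syntax with \xNN escapes. The sketch below is hypothetical: the file name is made up, the values are the two entries recommended above re-encoded from octal to hex, and it assumes the llvm_nvme_fuzz wrapper forwards unrecognized flags to libFuzzer, which this log does not demonstrate.

    # Hypothetical dictionary from the run above ("\362\334\036\234\344\177\000\000"
    # and "\001\000\000\000\000\000\004\000", re-encoded to hex escapes).
    cat > /tmp/nvme_admin.dict <<'EOF'
    kw1="\xf2\xdc\x1e\x9c\xe4\x7f\x00\x00"
    kw2="\x01\x00\x00\x00\x00\x00\x04\x00"
    EOF
    # A later run could then seed its mutations with these tokens:
    #   llvm_nvme_fuzz ... -dict=/tmp/nvme_admin.dict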
00:07:48.569 [2024-04-25 20:55:04.152621] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:48.569 [2024-04-25 20:55:04.172088] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:48.569 [2024-04-25 20:55:04.224565] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:48.837 [2024-04-25 20:55:04.240830] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4406 *** 00:07:48.837 INFO: Running with entropic power schedule (0xFF, 100). 00:07:48.837 INFO: Seed: 165994934 00:07:48.837 INFO: Loaded 1 modules (348579 inline 8-bit counters): 348579 [0x2764bcc, 0x27b9d6f), 00:07:48.837 INFO: Loaded 1 PC tables (348579 PCs): 348579 [0x27b9d70,0x2d0b7a0), 00:07:48.837 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:48.837 INFO: A corpus is not provided, starting from an empty corpus 00:07:48.837 #2 INITED exec/s: 0 rss: 60Mb 00:07:48.837 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:48.837 This may also happen if the target rejected all inputs we tried so far 00:07:48.837 [2024-04-25 20:55:04.286184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000eeee cdw11:00000000 00:07:48.837 [2024-04-25 20:55:04.286213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.838 [2024-04-25 20:55:04.286265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ee0a cdw11:00000000 00:07:48.838 [2024-04-25 20:55:04.286280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.097 NEW_FUNC[1/668]: 0x4ae770 in fuzz_admin_delete_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:161 00:07:49.097 NEW_FUNC[2/668]: 0x4e0360 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:49.097 #3 NEW cov: 11613 ft: 11614 corp: 2/5b lim: 10 exec/s: 0 rss: 68Mb L: 4/4 MS: 1 InsertRepeatedBytes- 00:07:49.098 [2024-04-25 20:55:04.597030] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007fee cdw11:00000000 00:07:49.098 [2024-04-25 20:55:04.597061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.098 [2024-04-25 20:55:04.597119] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000eeee cdw11:00000000 00:07:49.098 [2024-04-25 20:55:04.597136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.098 NEW_FUNC[1/1]: 0xfcb7a0 in _sock_flush /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/module/sock/posix/posix.c:1323 00:07:49.098 #6 NEW cov: 11744 ft: 12199 corp: 3/10b lim: 10 exec/s: 0 rss: 68Mb L: 5/5 MS: 3 ChangeByte-ShuffleBytes-CrossOver- 00:07:49.098 [2024-04-25 20:55:04.637081] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007fee cdw11:00000000 00:07:49.098 [2024-04-25 20:55:04.637106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
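The "WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?" line is libFuzzer's standard notice when it starts from an empty corpus; it is not an instrumentation failure here, since the banner just above it reports 348579 inline 8-bit counters loaded. For context, a generic libFuzzer target is built and smoke-tested as below. This is only an illustration of where those counters come from, not SPDK's actual build invocation.

    # Generic libFuzzer example (not SPDK's build line): -fsanitize=fuzzer links
    # the libFuzzer driver and inserts the inline 8-bit coverage counters that
    # the startup banner reports.
    clang -g -O1 -fsanitize=fuzzer,address harness.c -o harness
    ./harness -runs=0 seed_corpus/   # executes the seeds, prints INITED, exits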
00:07:49.098 [2024-04-25 20:55:04.637163] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00003fee cdw11:00000000 00:07:49.098 [2024-04-25 20:55:04.637177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.098 #7 NEW cov: 11750 ft: 12402 corp: 4/15b lim: 10 exec/s: 0 rss: 68Mb L: 5/5 MS: 1 ChangeByte- 00:07:49.098 [2024-04-25 20:55:04.677474] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000eeee cdw11:00000000 00:07:49.098 [2024-04-25 20:55:04.677499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.098 [2024-04-25 20:55:04.677553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ee5e cdw11:00000000 00:07:49.098 [2024-04-25 20:55:04.677567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.098 [2024-04-25 20:55:04.677623] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00005e5e cdw11:00000000 00:07:49.098 [2024-04-25 20:55:04.677637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.098 [2024-04-25 20:55:04.677690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00005e0a cdw11:00000000 00:07:49.098 [2024-04-25 20:55:04.677703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.098 #8 NEW cov: 11835 ft: 12918 corp: 5/23b lim: 10 exec/s: 0 rss: 68Mb L: 8/8 MS: 1 InsertRepeatedBytes- 00:07:49.098 [2024-04-25 20:55:04.727217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000eeee cdw11:00000000 00:07:49.098 [2024-04-25 20:55:04.727244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.098 #9 NEW cov: 11835 ft: 13288 corp: 6/26b lim: 10 exec/s: 0 rss: 68Mb L: 3/8 MS: 1 EraseBytes- 00:07:49.357 [2024-04-25 20:55:04.767484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000eeee cdw11:00000000 00:07:49.357 [2024-04-25 20:55:04.767509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.357 [2024-04-25 20:55:04.767566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ee0a cdw11:00000000 00:07:49.357 [2024-04-25 20:55:04.767581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.357 #10 NEW cov: 11835 ft: 13362 corp: 7/31b lim: 10 exec/s: 0 rss: 69Mb L: 5/8 MS: 1 InsertByte- 00:07:49.357 [2024-04-25 20:55:04.807884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a43 cdw11:00000000 00:07:49.357 [2024-04-25 20:55:04.807909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.357 [2024-04-25 20:55:04.807966] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) 
qid:0 cid:5 nsid:0 cdw10:00004343 cdw11:00000000 00:07:49.357 [2024-04-25 20:55:04.807980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.357 [2024-04-25 20:55:04.808046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00004343 cdw11:00000000 00:07:49.357 [2024-04-25 20:55:04.808060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.357 [2024-04-25 20:55:04.808116] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00004343 cdw11:00000000 00:07:49.357 [2024-04-25 20:55:04.808129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.357 [2024-04-25 20:55:04.808186] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00004343 cdw11:00000000 00:07:49.357 [2024-04-25 20:55:04.808200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:49.357 #11 NEW cov: 11835 ft: 13448 corp: 8/41b lim: 10 exec/s: 0 rss: 69Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:07:49.357 [2024-04-25 20:55:04.847752] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000997f cdw11:00000000 00:07:49.358 [2024-04-25 20:55:04.847776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.358 [2024-04-25 20:55:04.847832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000eeee cdw11:00000000 00:07:49.358 [2024-04-25 20:55:04.847845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.358 [2024-04-25 20:55:04.847897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ee0a cdw11:00000000 00:07:49.358 [2024-04-25 20:55:04.847911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.358 #12 NEW cov: 11835 ft: 13610 corp: 9/47b lim: 10 exec/s: 0 rss: 69Mb L: 6/10 MS: 1 InsertByte- 00:07:49.358 [2024-04-25 20:55:04.887924] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00009981 cdw11:00000000 00:07:49.358 [2024-04-25 20:55:04.887950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.358 [2024-04-25 20:55:04.888008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00001111 cdw11:00000000 00:07:49.358 [2024-04-25 20:55:04.888022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.358 [2024-04-25 20:55:04.888079] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00001b0a cdw11:00000000 00:07:49.358 [2024-04-25 20:55:04.888093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.358 #13 NEW cov: 11835 ft: 13647 corp: 10/53b lim: 10 exec/s: 0 rss: 69Mb L: 6/10 MS: 1 ChangeBinInt- 
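Every NEW record above follows the same libFuzzer status format: cov: (covered control-flow points), ft: (features, a finer-grained count than cov), corp: (corpus entries/total bytes), lim: (current input-length cap), exec/s:, rss:, L: (this input's length over the largest in the corpus), and MS: (the mutation sequence that produced it, e.g. ChangeBinInt). A rough awk sketch for trending coverage from a capture of this output follows; it assumes a plain file named fuzz.log and scans tokens rather than fixed columns, since this transcript prefixes and wraps the raw fuzzer lines.

    # Pull event id, cov, ft and corpus size out of each "NEW" status record.
    awk '/ NEW / && / cov: / {
      ev = cov = ft = corp = "?"
      for (i = 1; i <= NF; i++) {
        if ($i ~ /^#[0-9]+$/) ev   = $i        # e.g. "#13"
        if ($i == "cov:")     cov  = $(i + 1)
        if ($i == "ft:")      ft   = $(i + 1)
        if ($i == "corp:")    corp = $(i + 1)  # e.g. "10/53b"
      }
      printf "%s cov=%s ft=%s corp=%s\n", ev, cov, ft, corp
    }' fuzz.log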
00:07:49.358 [2024-04-25 20:55:04.927900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a43 cdw11:00000000 00:07:49.358 [2024-04-25 20:55:04.927925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.358 [2024-04-25 20:55:04.927981] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00004343 cdw11:00000000 00:07:49.358 [2024-04-25 20:55:04.928000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.358 #14 NEW cov: 11835 ft: 13686 corp: 11/58b lim: 10 exec/s: 0 rss: 69Mb L: 5/10 MS: 1 EraseBytes- 00:07:49.358 [2024-04-25 20:55:04.967894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:49.358 [2024-04-25 20:55:04.967920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.358 #15 NEW cov: 11835 ft: 13710 corp: 12/60b lim: 10 exec/s: 0 rss: 69Mb L: 2/10 MS: 1 CopyPart- 00:07:49.358 [2024-04-25 20:55:05.008520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a43 cdw11:00000000 00:07:49.358 [2024-04-25 20:55:05.008546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.358 [2024-04-25 20:55:05.008602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00004343 cdw11:00000000 00:07:49.358 [2024-04-25 20:55:05.008617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.358 [2024-04-25 20:55:05.008672] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00004343 cdw11:00000000 00:07:49.358 [2024-04-25 20:55:05.008687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.358 [2024-04-25 20:55:05.008744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000bbbc cdw11:00000000 00:07:49.358 [2024-04-25 20:55:05.008758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.358 [2024-04-25 20:55:05.008813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00004343 cdw11:00000000 00:07:49.358 [2024-04-25 20:55:05.008826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:49.618 #16 NEW cov: 11835 ft: 13795 corp: 13/70b lim: 10 exec/s: 0 rss: 69Mb L: 10/10 MS: 1 ChangeBinInt- 00:07:49.618 [2024-04-25 20:55:05.048278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a43 cdw11:00000000 00:07:49.618 [2024-04-25 20:55:05.048305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.618 [2024-04-25 20:55:05.048361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:000043bd cdw11:00000000 00:07:49.618 [2024-04-25 20:55:05.048375] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.618 #17 NEW cov: 11835 ft: 13854 corp: 14/75b lim: 10 exec/s: 0 rss: 69Mb L: 5/10 MS: 1 ChangeBinInt- 00:07:49.618 [2024-04-25 20:55:05.088259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00001b1b cdw11:00000000 00:07:49.618 [2024-04-25 20:55:05.088284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.618 #21 NEW cov: 11835 ft: 13885 corp: 15/77b lim: 10 exec/s: 0 rss: 69Mb L: 2/10 MS: 4 ShuffleBytes-ChangeBit-ChangeBit-CopyPart- 00:07:49.618 [2024-04-25 20:55:05.128730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00001bff cdw11:00000000 00:07:49.618 [2024-04-25 20:55:05.128755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.618 [2024-04-25 20:55:05.128813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:49.618 [2024-04-25 20:55:05.128827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.618 [2024-04-25 20:55:05.128884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:49.618 [2024-04-25 20:55:05.128899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.618 [2024-04-25 20:55:05.128958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:49.618 [2024-04-25 20:55:05.128971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.618 #22 NEW cov: 11835 ft: 13961 corp: 16/86b lim: 10 exec/s: 0 rss: 69Mb L: 9/10 MS: 1 InsertRepeatedBytes- 00:07:49.618 [2024-04-25 20:55:05.178616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000243 cdw11:00000000 00:07:49.618 [2024-04-25 20:55:05.178642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.618 [2024-04-25 20:55:05.178699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00004343 cdw11:00000000 00:07:49.618 [2024-04-25 20:55:05.178713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.618 NEW_FUNC[1/1]: 0x19e6b30 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:49.618 #23 NEW cov: 11858 ft: 14034 corp: 17/91b lim: 10 exec/s: 0 rss: 69Mb L: 5/10 MS: 1 ChangeBit- 00:07:49.618 [2024-04-25 20:55:05.218615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000099ee cdw11:00000000 00:07:49.618 [2024-04-25 20:55:05.218641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.618 #24 NEW cov: 11858 ft: 14057 corp: 18/94b lim: 10 exec/s: 0 rss: 69Mb L: 3/10 MS: 1 EraseBytes- 00:07:49.618 [2024-04-25 20:55:05.258839] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a43 cdw11:00000000 00:07:49.618 [2024-04-25 20:55:05.258866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.618 [2024-04-25 20:55:05.258922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:000043bd cdw11:00000000 00:07:49.618 [2024-04-25 20:55:05.258936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.878 #25 NEW cov: 11858 ft: 14123 corp: 19/99b lim: 10 exec/s: 25 rss: 70Mb L: 5/10 MS: 1 ChangeBit- 00:07:49.878 [2024-04-25 20:55:05.299237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a43 cdw11:00000000 00:07:49.878 [2024-04-25 20:55:05.299263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.878 [2024-04-25 20:55:05.299320] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00004343 cdw11:00000000 00:07:49.878 [2024-04-25 20:55:05.299335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.878 [2024-04-25 20:55:05.299392] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00004343 cdw11:00000000 00:07:49.878 [2024-04-25 20:55:05.299406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.878 [2024-04-25 20:55:05.299462] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00004343 cdw11:00000000 00:07:49.878 [2024-04-25 20:55:05.299476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.878 #26 NEW cov: 11858 ft: 14146 corp: 20/108b lim: 10 exec/s: 26 rss: 70Mb L: 9/10 MS: 1 EraseBytes- 00:07:49.878 [2024-04-25 20:55:05.338917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000099ee cdw11:00000000 00:07:49.878 [2024-04-25 20:55:05.338943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.878 #27 NEW cov: 11858 ft: 14236 corp: 21/111b lim: 10 exec/s: 27 rss: 70Mb L: 3/10 MS: 1 CrossOver- 00:07:49.878 [2024-04-25 20:55:05.379345] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007fe6 cdw11:00000000 00:07:49.878 [2024-04-25 20:55:05.379370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.878 [2024-04-25 20:55:05.379430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000eeee cdw11:00000000 00:07:49.878 [2024-04-25 20:55:05.379444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.878 [2024-04-25 20:55:05.379500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ee0a cdw11:00000000 00:07:49.878 [2024-04-25 20:55:05.379514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.878 #28 NEW cov: 11858 ft: 14260 corp: 22/117b lim: 10 exec/s: 28 rss: 70Mb L: 6/10 MS: 1 InsertByte- 00:07:49.878 [2024-04-25 20:55:05.419397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00006b6b cdw11:00000000 00:07:49.878 [2024-04-25 20:55:05.419422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.878 [2024-04-25 20:55:05.419478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006b6b cdw11:00000000 00:07:49.878 [2024-04-25 20:55:05.419492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.878 [2024-04-25 20:55:05.419548] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00006b6b cdw11:00000000 00:07:49.878 [2024-04-25 20:55:05.419562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.878 #32 NEW cov: 11858 ft: 14381 corp: 23/124b lim: 10 exec/s: 32 rss: 70Mb L: 7/10 MS: 4 EraseBytes-ChangeByte-CopyPart-InsertRepeatedBytes- 00:07:49.878 [2024-04-25 20:55:05.459277] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000beee cdw11:00000000 00:07:49.878 [2024-04-25 20:55:05.459302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.878 #33 NEW cov: 11858 ft: 14386 corp: 24/127b lim: 10 exec/s: 33 rss: 70Mb L: 3/10 MS: 1 ChangeByte- 00:07:49.878 [2024-04-25 20:55:05.499650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000200 cdw11:00000000 00:07:49.878 [2024-04-25 20:55:05.499675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.878 [2024-04-25 20:55:05.499733] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00004343 cdw11:00000000 00:07:49.878 [2024-04-25 20:55:05.499747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.878 [2024-04-25 20:55:05.499802] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00004343 cdw11:00000000 00:07:49.878 [2024-04-25 20:55:05.499817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.878 #34 NEW cov: 11858 ft: 14397 corp: 25/133b lim: 10 exec/s: 34 rss: 70Mb L: 6/10 MS: 1 InsertByte- 00:07:49.878 [2024-04-25 20:55:05.539801] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007fee cdw11:00000000 00:07:49.878 [2024-04-25 20:55:05.539826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.878 [2024-04-25 20:55:05.539885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000233f cdw11:00000000 00:07:49.878 [2024-04-25 20:55:05.539899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.878 
[2024-04-25 20:55:05.539957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ee0a cdw11:00000000 00:07:49.878 [2024-04-25 20:55:05.539974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.138 #35 NEW cov: 11858 ft: 14400 corp: 26/139b lim: 10 exec/s: 35 rss: 70Mb L: 6/10 MS: 1 InsertByte- 00:07:50.138 [2024-04-25 20:55:05.579708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a99 cdw11:00000000 00:07:50.138 [2024-04-25 20:55:05.579732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.138 [2024-04-25 20:55:05.579790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00008111 cdw11:00000000 00:07:50.138 [2024-04-25 20:55:05.579804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.138 #37 NEW cov: 11858 ft: 14414 corp: 27/144b lim: 10 exec/s: 37 rss: 70Mb L: 5/10 MS: 2 CopyPart-CrossOver- 00:07:50.138 [2024-04-25 20:55:05.609939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000197f cdw11:00000000 00:07:50.138 [2024-04-25 20:55:05.609964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.138 [2024-04-25 20:55:05.610024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000eeee cdw11:00000000 00:07:50.138 [2024-04-25 20:55:05.610039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.138 [2024-04-25 20:55:05.610095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ee0a cdw11:00000000 00:07:50.138 [2024-04-25 20:55:05.610109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.138 #38 NEW cov: 11858 ft: 14441 corp: 28/150b lim: 10 exec/s: 38 rss: 70Mb L: 6/10 MS: 1 ChangeBit- 00:07:50.138 [2024-04-25 20:55:05.650194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00001bff cdw11:00000000 00:07:50.138 [2024-04-25 20:55:05.650219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.138 [2024-04-25 20:55:05.650274] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:50.138 [2024-04-25 20:55:05.650288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.138 [2024-04-25 20:55:05.650343] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:50.138 [2024-04-25 20:55:05.650356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.138 [2024-04-25 20:55:05.650412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ff1b cdw11:00000000 00:07:50.138 [2024-04-25 20:55:05.650425] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.138 #39 NEW cov: 11858 ft: 14451 corp: 29/158b lim: 10 exec/s: 39 rss: 70Mb L: 8/10 MS: 1 EraseBytes- 00:07:50.138 [2024-04-25 20:55:05.690339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00001bff cdw11:00000000 00:07:50.138 [2024-04-25 20:55:05.690363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.138 [2024-04-25 20:55:05.690418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:50.138 [2024-04-25 20:55:05.690432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.138 [2024-04-25 20:55:05.690488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:50.138 [2024-04-25 20:55:05.690504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.138 [2024-04-25 20:55:05.690559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ffc4 cdw11:00000000 00:07:50.138 [2024-04-25 20:55:05.690573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.138 #40 NEW cov: 11858 ft: 14452 corp: 30/167b lim: 10 exec/s: 40 rss: 70Mb L: 9/10 MS: 1 InsertByte- 00:07:50.138 [2024-04-25 20:55:05.730528] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a43 cdw11:00000000 00:07:50.138 [2024-04-25 20:55:05.730553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.138 [2024-04-25 20:55:05.730611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00004343 cdw11:00000000 00:07:50.138 [2024-04-25 20:55:05.730625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.138 [2024-04-25 20:55:05.730679] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00004343 cdw11:00000000 00:07:50.138 [2024-04-25 20:55:05.730693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.138 [2024-04-25 20:55:05.730748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00004343 cdw11:00000000 00:07:50.138 [2024-04-25 20:55:05.730761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.138 [2024-04-25 20:55:05.730815] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00004343 cdw11:00000000 00:07:50.138 [2024-04-25 20:55:05.730828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:50.138 #41 NEW cov: 11858 ft: 14470 corp: 31/177b lim: 10 exec/s: 41 rss: 70Mb L: 10/10 MS: 1 CrossOver- 00:07:50.138 [2024-04-25 20:55:05.770385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ 
(04) qid:0 cid:4 nsid:0 cdw10:0000ed80 cdw11:00000000 00:07:50.138 [2024-04-25 20:55:05.770410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.138 [2024-04-25 20:55:05.770467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00001111 cdw11:00000000 00:07:50.138 [2024-04-25 20:55:05.770481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.138 [2024-04-25 20:55:05.770534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ee0a cdw11:00000000 00:07:50.138 [2024-04-25 20:55:05.770548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.138 #42 NEW cov: 11858 ft: 14505 corp: 32/183b lim: 10 exec/s: 42 rss: 70Mb L: 6/10 MS: 1 ChangeBinInt- 00:07:50.397 [2024-04-25 20:55:05.810265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008111 cdw11:00000000 00:07:50.398 [2024-04-25 20:55:05.810290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.398 #43 NEW cov: 11858 ft: 14553 corp: 33/186b lim: 10 exec/s: 43 rss: 70Mb L: 3/10 MS: 1 CrossOver- 00:07:50.398 [2024-04-25 20:55:05.850381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00001b0b cdw11:00000000 00:07:50.398 [2024-04-25 20:55:05.850406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.398 #44 NEW cov: 11858 ft: 14561 corp: 34/188b lim: 10 exec/s: 44 rss: 70Mb L: 2/10 MS: 1 ChangeBit- 00:07:50.398 [2024-04-25 20:55:05.891028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007f23 cdw11:00000000 00:07:50.398 [2024-04-25 20:55:05.891056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.398 [2024-04-25 20:55:05.891115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00003fee cdw11:00000000 00:07:50.398 [2024-04-25 20:55:05.891129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.398 [2024-04-25 20:55:05.891183] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000aee cdw11:00000000 00:07:50.398 [2024-04-25 20:55:05.891197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.398 [2024-04-25 20:55:05.891251] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000233f cdw11:00000000 00:07:50.398 [2024-04-25 20:55:05.891265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.398 [2024-04-25 20:55:05.891321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000ee0a cdw11:00000000 00:07:50.398 [2024-04-25 20:55:05.891335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 
00:07:50.398 #45 NEW cov: 11858 ft: 14571 corp: 35/198b lim: 10 exec/s: 45 rss: 70Mb L: 10/10 MS: 1 CopyPart- 00:07:50.398 [2024-04-25 20:55:05.931147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a43 cdw11:00000000 00:07:50.398 [2024-04-25 20:55:05.931172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.398 [2024-04-25 20:55:05.931227] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00004343 cdw11:00000000 00:07:50.398 [2024-04-25 20:55:05.931241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.398 [2024-04-25 20:55:05.931294] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00004343 cdw11:00000000 00:07:50.398 [2024-04-25 20:55:05.931308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.398 [2024-04-25 20:55:05.931363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000bbbc cdw11:00000000 00:07:50.398 [2024-04-25 20:55:05.931376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.398 [2024-04-25 20:55:05.931430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00004343 cdw11:00000000 00:07:50.398 [2024-04-25 20:55:05.931444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:50.398 #46 NEW cov: 11858 ft: 14583 corp: 36/208b lim: 10 exec/s: 46 rss: 70Mb L: 10/10 MS: 1 ShuffleBytes- 00:07:50.398 [2024-04-25 20:55:05.971229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a43 cdw11:00000000 00:07:50.398 [2024-04-25 20:55:05.971254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.398 [2024-04-25 20:55:05.971311] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00004343 cdw11:00000000 00:07:50.398 [2024-04-25 20:55:05.971325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.398 [2024-04-25 20:55:05.971380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000b3bc cdw11:00000000 00:07:50.398 [2024-04-25 20:55:05.971394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.398 [2024-04-25 20:55:05.971450] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00004343 cdw11:00000000 00:07:50.398 [2024-04-25 20:55:05.971464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.398 [2024-04-25 20:55:05.971518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00004343 cdw11:00000000 00:07:50.398 [2024-04-25 20:55:05.971532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 
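Each *NOTICE* pair above is the fuzzer printing the admin command it submitted (nvme_admin_qpair_print_command) and the completion the target returned (spdk_nvme_print_completion). The commands here are DELETE IO CQ (opcode 0x04), and per the NVMe base specification the low 16 bits of CDW10 carry the identifier of the queue to delete, so cdw10:00000a43 targets CQ 0x0a43. The "INVALID OPCODE (00/01)" completions are expected: an NVMe-oF target creates and tears down I/O queues through the Fabrics Connect command, not the Create/Delete I/O queue admin commands, so the target rejects the opcode outright. A hedged decode sketch, based on the spec rather than on SPDK code:

    def decode_delete_io_q(cdw10: int) -> int:
        """Per the NVMe base spec, Delete I/O SQ/CQ (opcodes 0x00/0x04)
        carry the target Queue Identifier (QID) in CDW10 bits 15:0."""
        return cdw10 & 0xFFFF

    assert decode_delete_io_q(0x00000A43) == 0x0A43  # from the notices above
    assert decode_delete_io_q(0x00003FEE) == 0x3FEE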
00:07:50.398 #47 NEW cov: 11858 ft: 14587 corp: 37/218b lim: 10 exec/s: 47 rss: 70Mb L: 10/10 MS: 1 ChangeBinInt- 00:07:50.398 [2024-04-25 20:55:06.011228] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00001b09 cdw11:00000000 00:07:50.398 [2024-04-25 20:55:06.011253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.398 [2024-04-25 20:55:06.011310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:50.398 [2024-04-25 20:55:06.011323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.398 [2024-04-25 20:55:06.011379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:50.398 [2024-04-25 20:55:06.011393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.398 [2024-04-25 20:55:06.011446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ffc4 cdw11:00000000 00:07:50.398 [2024-04-25 20:55:06.011460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.398 #48 NEW cov: 11858 ft: 14592 corp: 38/227b lim: 10 exec/s: 48 rss: 70Mb L: 9/10 MS: 1 ChangeBinInt- 00:07:50.398 [2024-04-25 20:55:06.050944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ee0a cdw11:00000000 00:07:50.398 [2024-04-25 20:55:06.050968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.657 #49 NEW cov: 11858 ft: 14595 corp: 39/229b lim: 10 exec/s: 49 rss: 70Mb L: 2/10 MS: 1 EraseBytes- 00:07:50.657 [2024-04-25 20:55:06.091482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000e1e1 cdw11:00000000 00:07:50.657 [2024-04-25 20:55:06.091507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.657 [2024-04-25 20:55:06.091562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000e17f cdw11:00000000 00:07:50.657 [2024-04-25 20:55:06.091576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.657 [2024-04-25 20:55:06.091630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ee3f cdw11:00000000 00:07:50.657 [2024-04-25 20:55:06.091644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.657 [2024-04-25 20:55:06.091699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ee0a cdw11:00000000 00:07:50.657 [2024-04-25 20:55:06.091712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.657 #50 NEW cov: 11858 ft: 14599 corp: 40/237b lim: 10 exec/s: 50 rss: 70Mb L: 8/10 MS: 1 InsertRepeatedBytes- 00:07:50.657 [2024-04-25 20:55:06.131424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007fee cdw11:00000000 00:07:50.657 [2024-04-25 20:55:06.131450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.657 [2024-04-25 20:55:06.131511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00003fee cdw11:00000000 00:07:50.657 [2024-04-25 20:55:06.131525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.657 [2024-04-25 20:55:06.131582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000240a cdw11:00000000 00:07:50.657 [2024-04-25 20:55:06.131595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.657 #51 NEW cov: 11858 ft: 14602 corp: 41/243b lim: 10 exec/s: 51 rss: 70Mb L: 6/10 MS: 1 InsertByte- 00:07:50.657 [2024-04-25 20:55:06.161549] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00001b09 cdw11:00000000 00:07:50.657 [2024-04-25 20:55:06.161574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.657 [2024-04-25 20:55:06.161633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:50.657 [2024-04-25 20:55:06.161648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.657 [2024-04-25 20:55:06.161705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ff1b cdw11:00000000 00:07:50.657 [2024-04-25 20:55:06.161718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.657 #52 NEW cov: 11858 ft: 14618 corp: 42/249b lim: 10 exec/s: 52 rss: 71Mb L: 6/10 MS: 1 EraseBytes- 00:07:50.657 [2024-04-25 20:55:06.201550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ee5e cdw11:00000000 00:07:50.657 [2024-04-25 20:55:06.201576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.657 [2024-04-25 20:55:06.201634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00005e0a cdw11:00000000 00:07:50.657 [2024-04-25 20:55:06.201646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.657 #53 NEW cov: 11858 ft: 14638 corp: 43/253b lim: 10 exec/s: 53 rss: 71Mb L: 4/10 MS: 1 EraseBytes- 00:07:50.657 [2024-04-25 20:55:06.241792] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000243 cdw11:00000000 00:07:50.657 [2024-04-25 20:55:06.241817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.657 [2024-04-25 20:55:06.241873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ff43 cdw11:00000000 00:07:50.657 [2024-04-25 20:55:06.241887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:07:50.657 [2024-04-25 20:55:06.241942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00004343 cdw11:00000000 00:07:50.657 [2024-04-25 20:55:06.241956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.657 #54 NEW cov: 11858 ft: 14644 corp: 44/259b lim: 10 exec/s: 54 rss: 71Mb L: 6/10 MS: 1 InsertByte- 00:07:50.657 [2024-04-25 20:55:06.281624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ee02 cdw11:00000000 00:07:50.657 [2024-04-25 20:55:06.281649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.657 #55 NEW cov: 11858 ft: 14688 corp: 45/261b lim: 10 exec/s: 27 rss: 71Mb L: 2/10 MS: 1 ChangeBit- 00:07:50.657 #55 DONE cov: 11858 ft: 14688 corp: 45/261b lim: 10 exec/s: 27 rss: 71Mb 00:07:50.657 Done 55 runs in 2 second(s) 00:07:50.917 20:55:06 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_6.conf /var/tmp/suppress_nvmf_fuzz 00:07:50.917 20:55:06 -- ../common.sh@72 -- # (( i++ )) 00:07:50.917 20:55:06 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:50.917 20:55:06 -- ../common.sh@73 -- # start_llvm_fuzz 7 1 0x1 00:07:50.917 20:55:06 -- nvmf/run.sh@23 -- # local fuzzer_type=7 00:07:50.917 20:55:06 -- nvmf/run.sh@24 -- # local timen=1 00:07:50.917 20:55:06 -- nvmf/run.sh@25 -- # local core=0x1 00:07:50.917 20:55:06 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:50.917 20:55:06 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_7.conf 00:07:50.917 20:55:06 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:50.917 20:55:06 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:50.917 20:55:06 -- nvmf/run.sh@34 -- # printf %02d 7 00:07:50.917 20:55:06 -- nvmf/run.sh@34 -- # port=4407 00:07:50.917 20:55:06 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:50.917 20:55:06 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' 00:07:50.917 20:55:06 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4407"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:50.917 20:55:06 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:50.917 20:55:06 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:50.917 20:55:06 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' -c /tmp/fuzz_json_7.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 -Z 7 00:07:50.917 [2024-04-25 20:55:06.456357] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 
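The traced nvmf/run.sh lines above show how each fuzzer instance is wired up: instance 7 is zero-padded and appended to 44 to give TCP port 4407, a transport ID for the nvmf target is assembled from it, the stock fuzz_json.conf has its trsvcid sed-rewritten to that port, and two known allocations (spdk_nvmf_qpair_disconnect, nvmf_ctrlr_create) are suppressed for LeakSanitizer before llvm_nvme_fuzz is launched. A small sketch of the same TRID derivation (illustrative only; the function name is an assumption, the real logic lives in the run.sh traced above):

    def nvmf_fuzz_trid(fuzzer_num: int, traddr: str = "127.0.0.1") -> str:
        """Rebuild the -F transport ID: port 44NN, NN = zero-padded instance."""
        port = 4400 + fuzzer_num  # printf %02d 7 -> "07" -> trsvcid 4407
        return ("trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 "
                f"traddr:{traddr} trsvcid:{port}")

    # nvmf_fuzz_trid(7) reproduces the TRID passed to llvm_nvme_fuzz above.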
00:07:50.917 [2024-04-25 20:55:06.456426] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid195276 ] 00:07:50.917 EAL: No free 2048 kB hugepages reported on node 1 00:07:51.176 [2024-04-25 20:55:06.601125] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:51.176 [2024-04-25 20:55:06.638507] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:51.176 [2024-04-25 20:55:06.658388] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:51.176 [2024-04-25 20:55:06.710496] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:51.176 [2024-04-25 20:55:06.726707] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4407 *** 00:07:51.176 INFO: Running with entropic power schedule (0xFF, 100). 00:07:51.176 INFO: Seed: 2649997657 00:07:51.176 INFO: Loaded 1 modules (348579 inline 8-bit counters): 348579 [0x2764bcc, 0x27b9d6f), 00:07:51.176 INFO: Loaded 1 PC tables (348579 PCs): 348579 [0x27b9d70,0x2d0b7a0), 00:07:51.176 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:51.176 INFO: A corpus is not provided, starting from an empty corpus 00:07:51.176 #2 INITED exec/s: 0 rss: 60Mb 00:07:51.176 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:51.176 This may also happen if the target rejected all inputs we tried so far 00:07:51.176 [2024-04-25 20:55:06.796400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:51.176 [2024-04-25 20:55:06.796437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.744 NEW_FUNC[1/669]: 0x4af160 in fuzz_admin_delete_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:172 00:07:51.744 NEW_FUNC[2/669]: 0x4e0360 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:51.744 #3 NEW cov: 11614 ft: 11599 corp: 2/3b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 CrossOver- 00:07:51.744 [2024-04-25 20:55:07.136953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002a0a cdw11:00000000 00:07:51.744 [2024-04-25 20:55:07.136990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.744 #5 NEW cov: 11744 ft: 12127 corp: 3/6b lim: 10 exec/s: 0 rss: 68Mb L: 3/3 MS: 2 ChangeBit-CrossOver- 00:07:51.744 [2024-04-25 20:55:07.177502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a89 cdw11:00000000 00:07:51.744 [2024-04-25 20:55:07.177529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.744 [2024-04-25 20:55:07.177637] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000b2a9 cdw11:00000000 00:07:51.744 [2024-04-25 20:55:07.177656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 
cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.744 [2024-04-25 20:55:07.177765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00004bf4 cdw11:00000000 00:07:51.744 [2024-04-25 20:55:07.177783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.744 [2024-04-25 20:55:07.177895] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000fc76 cdw11:00000000 00:07:51.744 [2024-04-25 20:55:07.177913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.744 #6 NEW cov: 11750 ft: 12755 corp: 4/15b lim: 10 exec/s: 0 rss: 68Mb L: 9/9 MS: 1 CMP- DE: "\211\262\251K\364\374v\000"- 00:07:51.744 [2024-04-25 20:55:07.217140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004a4a cdw11:00000000 00:07:51.744 [2024-04-25 20:55:07.217168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.744 #8 NEW cov: 11835 ft: 12966 corp: 5/17b lim: 10 exec/s: 0 rss: 68Mb L: 2/9 MS: 2 ChangeBit-CopyPart- 00:07:51.744 [2024-04-25 20:55:07.257256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a64 cdw11:00000000 00:07:51.744 [2024-04-25 20:55:07.257284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.744 #10 NEW cov: 11835 ft: 13013 corp: 6/19b lim: 10 exec/s: 0 rss: 68Mb L: 2/9 MS: 2 ShuffleBytes-InsertByte- 00:07:51.744 [2024-04-25 20:55:07.297335] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008a0a cdw11:00000000 00:07:51.744 [2024-04-25 20:55:07.297362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.744 #11 NEW cov: 11835 ft: 13056 corp: 7/21b lim: 10 exec/s: 0 rss: 68Mb L: 2/9 MS: 1 ChangeBit- 00:07:51.744 [2024-04-25 20:55:07.337963] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000b2a9 cdw11:00000000 00:07:51.744 [2024-04-25 20:55:07.337998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.744 [2024-04-25 20:55:07.338125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00004bf4 cdw11:00000000 00:07:51.744 [2024-04-25 20:55:07.338143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.744 [2024-04-25 20:55:07.338272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000fc76 cdw11:00000000 00:07:51.744 [2024-04-25 20:55:07.338289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.744 #12 NEW cov: 11835 ft: 13263 corp: 8/28b lim: 10 exec/s: 0 rss: 68Mb L: 7/9 MS: 1 EraseBytes- 00:07:51.744 [2024-04-25 20:55:07.387821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000092a cdw11:00000000 00:07:51.744 [2024-04-25 20:55:07.387849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.744 [2024-04-25 20:55:07.387961] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:51.744 [2024-04-25 20:55:07.387980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.003 #13 NEW cov: 11835 ft: 13447 corp: 9/32b lim: 10 exec/s: 0 rss: 69Mb L: 4/9 MS: 1 InsertByte- 00:07:52.003 [2024-04-25 20:55:07.438606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a89 cdw11:00000000 00:07:52.003 [2024-04-25 20:55:07.438633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.003 [2024-04-25 20:55:07.438743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000b2a9 cdw11:00000000 00:07:52.003 [2024-04-25 20:55:07.438762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.003 [2024-04-25 20:55:07.438871] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000f44b cdw11:00000000 00:07:52.003 [2024-04-25 20:55:07.438889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.003 [2024-04-25 20:55:07.439010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000f4fc cdw11:00000000 00:07:52.003 [2024-04-25 20:55:07.439027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.003 [2024-04-25 20:55:07.439146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00007600 cdw11:00000000 00:07:52.003 [2024-04-25 20:55:07.439162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:52.003 #14 NEW cov: 11835 ft: 13516 corp: 10/42b lim: 10 exec/s: 0 rss: 69Mb L: 10/10 MS: 1 CopyPart- 00:07:52.004 [2024-04-25 20:55:07.477877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000092a cdw11:00000000 00:07:52.004 [2024-04-25 20:55:07.477902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.004 #15 NEW cov: 11835 ft: 13552 corp: 11/45b lim: 10 exec/s: 0 rss: 69Mb L: 3/10 MS: 1 EraseBytes- 00:07:52.004 [2024-04-25 20:55:07.518075] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000a04a cdw11:00000000 00:07:52.004 [2024-04-25 20:55:07.518103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.004 #16 NEW cov: 11835 ft: 13581 corp: 12/47b lim: 10 exec/s: 0 rss: 69Mb L: 2/10 MS: 1 ChangeByte- 00:07:52.004 [2024-04-25 20:55:07.558148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004a4a cdw11:00000000 00:07:52.004 [2024-04-25 20:55:07.558174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.004 #17 NEW cov: 11835 ft: 13599 corp: 13/49b lim: 10 
exec/s: 0 rss: 69Mb L: 2/10 MS: 1 ShuffleBytes- 00:07:52.004 [2024-04-25 20:55:07.598255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004a06 cdw11:00000000 00:07:52.004 [2024-04-25 20:55:07.598283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.004 #18 NEW cov: 11835 ft: 13718 corp: 14/51b lim: 10 exec/s: 0 rss: 69Mb L: 2/10 MS: 1 ChangeByte- 00:07:52.004 [2024-04-25 20:55:07.638672] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000b2a9 cdw11:00000000 00:07:52.004 [2024-04-25 20:55:07.638700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.004 [2024-04-25 20:55:07.638826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00004bf4 cdw11:00000000 00:07:52.004 [2024-04-25 20:55:07.638844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.004 [2024-04-25 20:55:07.638956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000fc76 cdw11:00000000 00:07:52.004 [2024-04-25 20:55:07.638975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.263 NEW_FUNC[1/1]: 0x19e6b30 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:52.263 #19 NEW cov: 11858 ft: 13730 corp: 15/57b lim: 10 exec/s: 0 rss: 69Mb L: 6/10 MS: 1 EraseBytes- 00:07:52.263 [2024-04-25 20:55:07.688503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004a42 cdw11:00000000 00:07:52.263 [2024-04-25 20:55:07.688530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.263 #20 NEW cov: 11858 ft: 13742 corp: 16/59b lim: 10 exec/s: 0 rss: 69Mb L: 2/10 MS: 1 ChangeBit- 00:07:52.263 [2024-04-25 20:55:07.728569] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a8a cdw11:00000000 00:07:52.263 [2024-04-25 20:55:07.728597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.263 #21 NEW cov: 11858 ft: 13757 corp: 17/62b lim: 10 exec/s: 0 rss: 69Mb L: 3/10 MS: 1 CrossOver- 00:07:52.263 [2024-04-25 20:55:07.759250] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a89 cdw11:00000000 00:07:52.263 [2024-04-25 20:55:07.759276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.263 [2024-04-25 20:55:07.759399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000b2a9 cdw11:00000000 00:07:52.263 [2024-04-25 20:55:07.759418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.263 [2024-04-25 20:55:07.759532] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00004bf4 cdw11:00000000 00:07:52.263 [2024-04-25 20:55:07.759549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.263 [2024-04-25 20:55:07.759660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000fc76 cdw11:00000000 00:07:52.263 [2024-04-25 20:55:07.759677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.263 #22 NEW cov: 11858 ft: 13784 corp: 18/71b lim: 10 exec/s: 22 rss: 69Mb L: 9/10 MS: 1 PersAutoDict- DE: "\211\262\251K\364\374v\000"- 00:07:52.263 [2024-04-25 20:55:07.799349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a89 cdw11:00000000 00:07:52.263 [2024-04-25 20:55:07.799376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.263 [2024-04-25 20:55:07.799484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000b2a9 cdw11:00000000 00:07:52.263 [2024-04-25 20:55:07.799502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.263 [2024-04-25 20:55:07.799613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00004bf4 cdw11:00000000 00:07:52.263 [2024-04-25 20:55:07.799631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.263 [2024-04-25 20:55:07.799749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000f476 cdw11:00000000 00:07:52.263 [2024-04-25 20:55:07.799767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.263 #23 NEW cov: 11858 ft: 13837 corp: 19/80b lim: 10 exec/s: 23 rss: 69Mb L: 9/10 MS: 1 ChangeBit- 00:07:52.264 [2024-04-25 20:55:07.839750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000b2a9 cdw11:00000000 00:07:52.264 [2024-04-25 20:55:07.839777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.264 [2024-04-25 20:55:07.839882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00004bff cdw11:00000000 00:07:52.264 [2024-04-25 20:55:07.839899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.264 [2024-04-25 20:55:07.840012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:52.264 [2024-04-25 20:55:07.840030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.264 [2024-04-25 20:55:07.840140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000fff4 cdw11:00000000 00:07:52.264 [2024-04-25 20:55:07.840160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.264 [2024-04-25 20:55:07.840272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000fc76 cdw11:00000000 00:07:52.264 [2024-04-25 20:55:07.840290] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:52.264 #24 NEW cov: 11858 ft: 13906 corp: 20/90b lim: 10 exec/s: 24 rss: 69Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:07:52.264 [2024-04-25 20:55:07.879271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a89 cdw11:00000000 00:07:52.264 [2024-04-25 20:55:07.879296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.264 [2024-04-25 20:55:07.879408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000fc76 cdw11:00000000 00:07:52.264 [2024-04-25 20:55:07.879426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.264 #25 NEW cov: 11858 ft: 13918 corp: 21/95b lim: 10 exec/s: 25 rss: 69Mb L: 5/10 MS: 1 EraseBytes- 00:07:52.264 [2024-04-25 20:55:07.919147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000fc76 cdw11:00000000 00:07:52.264 [2024-04-25 20:55:07.919173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.523 #26 NEW cov: 11858 ft: 13925 corp: 22/98b lim: 10 exec/s: 26 rss: 70Mb L: 3/10 MS: 1 EraseBytes- 00:07:52.523 [2024-04-25 20:55:07.959350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004a65 cdw11:00000000 00:07:52.523 [2024-04-25 20:55:07.959376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.523 #27 NEW cov: 11858 ft: 13959 corp: 23/100b lim: 10 exec/s: 27 rss: 70Mb L: 2/10 MS: 1 ChangeByte- 00:07:52.523 [2024-04-25 20:55:07.999485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004a8a cdw11:00000000 00:07:52.523 [2024-04-25 20:55:07.999512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.523 #28 NEW cov: 11858 ft: 13964 corp: 24/103b lim: 10 exec/s: 28 rss: 70Mb L: 3/10 MS: 1 ChangeBit- 00:07:52.523 [2024-04-25 20:55:08.040372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a89 cdw11:00000000 00:07:52.523 [2024-04-25 20:55:08.040401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.523 [2024-04-25 20:55:08.040521] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000b2a9 cdw11:00000000 00:07:52.523 [2024-04-25 20:55:08.040539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.523 [2024-04-25 20:55:08.040649] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00004bf4 cdw11:00000000 00:07:52.523 [2024-04-25 20:55:08.040668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.523 [2024-04-25 20:55:08.040780] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000f476 cdw11:00000000 00:07:52.523 [2024-04-25 20:55:08.040796] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.523 [2024-04-25 20:55:08.040901] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000002a cdw11:00000000 00:07:52.523 [2024-04-25 20:55:08.040919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:52.523 #29 NEW cov: 11858 ft: 14026 corp: 25/113b lim: 10 exec/s: 29 rss: 70Mb L: 10/10 MS: 1 CrossOver- 00:07:52.523 [2024-04-25 20:55:08.089756] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000a017 cdw11:00000000 00:07:52.523 [2024-04-25 20:55:08.089784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.523 #30 NEW cov: 11858 ft: 14036 corp: 26/115b lim: 10 exec/s: 30 rss: 70Mb L: 2/10 MS: 1 ChangeByte- 00:07:52.523 [2024-04-25 20:55:08.139857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a4a cdw11:00000000 00:07:52.523 [2024-04-25 20:55:08.139886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.523 #31 NEW cov: 11858 ft: 14040 corp: 27/118b lim: 10 exec/s: 31 rss: 70Mb L: 3/10 MS: 1 ShuffleBytes- 00:07:52.523 [2024-04-25 20:55:08.180242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004e89 cdw11:00000000 00:07:52.523 [2024-04-25 20:55:08.180270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.523 [2024-04-25 20:55:08.180383] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000fc76 cdw11:00000000 00:07:52.523 [2024-04-25 20:55:08.180411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.784 #32 NEW cov: 11858 ft: 14061 corp: 28/123b lim: 10 exec/s: 32 rss: 70Mb L: 5/10 MS: 1 ChangeByte- 00:07:52.784 [2024-04-25 20:55:08.220111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a4a cdw11:00000000 00:07:52.784 [2024-04-25 20:55:08.220139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.784 #33 NEW cov: 11858 ft: 14083 corp: 29/125b lim: 10 exec/s: 33 rss: 70Mb L: 2/10 MS: 1 CrossOver- 00:07:52.784 [2024-04-25 20:55:08.260808] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:52.784 [2024-04-25 20:55:08.260835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.784 [2024-04-25 20:55:08.260948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.784 [2024-04-25 20:55:08.260967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.784 [2024-04-25 20:55:08.261086] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 00:07:52.784 [2024-04-25 20:55:08.261104] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.784 [2024-04-25 20:55:08.261219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.784 [2024-04-25 20:55:08.261238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.784 [2024-04-25 20:55:08.261353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000002a cdw11:00000000 00:07:52.784 [2024-04-25 20:55:08.261370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:52.784 #34 NEW cov: 11858 ft: 14116 corp: 30/135b lim: 10 exec/s: 34 rss: 70Mb L: 10/10 MS: 1 CMP- DE: "\000\000\000\000\001\000\000\000"- 00:07:52.784 [2024-04-25 20:55:08.310796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000b2a9 cdw11:00000000 00:07:52.784 [2024-04-25 20:55:08.310825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.784 [2024-04-25 20:55:08.310943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00004bf4 cdw11:00000000 00:07:52.784 [2024-04-25 20:55:08.310962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.784 [2024-04-25 20:55:08.311087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000fc76 cdw11:00000000 00:07:52.784 [2024-04-25 20:55:08.311106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.784 #35 NEW cov: 11858 ft: 14173 corp: 31/142b lim: 10 exec/s: 35 rss: 70Mb L: 7/10 MS: 1 ShuffleBytes- 00:07:52.784 [2024-04-25 20:55:08.351207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000b25f cdw11:00000000 00:07:52.784 [2024-04-25 20:55:08.351234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.784 [2024-04-25 20:55:08.351347] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000b400 cdw11:00000000 00:07:52.784 [2024-04-25 20:55:08.351367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.784 [2024-04-25 20:55:08.351479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.784 [2024-04-25 20:55:08.351496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.784 [2024-04-25 20:55:08.351610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000000b cdw11:00000000 00:07:52.784 [2024-04-25 20:55:08.351630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.784 [2024-04-25 20:55:08.351745] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000376 cdw11:00000000 
00:07:52.784 [2024-04-25 20:55:08.351764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:52.784 #36 NEW cov: 11858 ft: 14229 corp: 32/152b lim: 10 exec/s: 36 rss: 70Mb L: 10/10 MS: 1 ChangeBinInt- 00:07:52.784 [2024-04-25 20:55:08.400833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000092a cdw11:00000000 00:07:52.784 [2024-04-25 20:55:08.400861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.784 [2024-04-25 20:55:08.400979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:52.784 [2024-04-25 20:55:08.401004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.784 #37 NEW cov: 11858 ft: 14247 corp: 33/156b lim: 10 exec/s: 37 rss: 70Mb L: 4/10 MS: 1 CopyPart- 00:07:52.784 [2024-04-25 20:55:08.440956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000092a cdw11:00000000 00:07:52.784 [2024-04-25 20:55:08.440982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.784 [2024-04-25 20:55:08.441099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:52.784 [2024-04-25 20:55:08.441119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.061 #38 NEW cov: 11858 ft: 14250 corp: 34/160b lim: 10 exec/s: 38 rss: 70Mb L: 4/10 MS: 1 ShuffleBytes- 00:07:53.061 [2024-04-25 20:55:08.480927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002364 cdw11:00000000 00:07:53.062 [2024-04-25 20:55:08.480954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.062 #39 NEW cov: 11858 ft: 14259 corp: 35/162b lim: 10 exec/s: 39 rss: 70Mb L: 2/10 MS: 1 ChangeByte- 00:07:53.062 [2024-04-25 20:55:08.521654] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:000089b2 cdw11:00000000 00:07:53.062 [2024-04-25 20:55:08.521683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.062 [2024-04-25 20:55:08.521796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000a94b cdw11:00000000 00:07:53.062 [2024-04-25 20:55:08.521815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.062 [2024-04-25 20:55:08.521925] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000f4f4 cdw11:00000000 00:07:53.062 [2024-04-25 20:55:08.521944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.062 [2024-04-25 20:55:08.522061] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00002a0a cdw11:00000000 00:07:53.062 [2024-04-25 20:55:08.522080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.062 #40 NEW cov: 11858 ft: 14277 corp: 36/171b lim: 10 exec/s: 40 rss: 70Mb L: 9/10 MS: 1 CrossOver- 00:07:53.062 [2024-04-25 20:55:08.561797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a89 cdw11:00000000 00:07:53.062 [2024-04-25 20:55:08.561824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.062 [2024-04-25 20:55:08.561932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000b2a9 cdw11:00000000 00:07:53.062 [2024-04-25 20:55:08.561950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.062 [2024-04-25 20:55:08.562058] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00004bf5 cdw11:00000000 00:07:53.062 [2024-04-25 20:55:08.562081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.062 [2024-04-25 20:55:08.562197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000fc76 cdw11:00000000 00:07:53.062 [2024-04-25 20:55:08.562215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.062 #41 NEW cov: 11858 ft: 14309 corp: 37/180b lim: 10 exec/s: 41 rss: 70Mb L: 9/10 MS: 1 ChangeBinInt- 00:07:53.062 [2024-04-25 20:55:08.602022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000b2a9 cdw11:00000000 00:07:53.062 [2024-04-25 20:55:08.602050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.062 [2024-04-25 20:55:08.602158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00004bff cdw11:00000000 00:07:53.062 [2024-04-25 20:55:08.602174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.062 [2024-04-25 20:55:08.602287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:53.062 [2024-04-25 20:55:08.602304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.062 [2024-04-25 20:55:08.602421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000fff4 cdw11:00000000 00:07:53.062 [2024-04-25 20:55:08.602438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.062 [2024-04-25 20:55:08.602554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000fc76 cdw11:00000000 00:07:53.062 [2024-04-25 20:55:08.602570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:53.062 #42 NEW cov: 11858 ft: 14365 corp: 38/190b lim: 10 exec/s: 42 rss: 70Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:07:53.062 [2024-04-25 20:55:08.641356] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002028 
cdw11:00000000 00:07:53.062 [2024-04-25 20:55:08.641382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.062 #46 NEW cov: 11858 ft: 14388 corp: 39/192b lim: 10 exec/s: 46 rss: 70Mb L: 2/10 MS: 4 EraseBytes-ChangeByte-ShuffleBytes-InsertByte- 00:07:53.062 [2024-04-25 20:55:08.682285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a89 cdw11:00000000 00:07:53.062 [2024-04-25 20:55:08.682311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.062 [2024-04-25 20:55:08.682422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000b2a9 cdw11:00000000 00:07:53.062 [2024-04-25 20:55:08.682440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.062 [2024-04-25 20:55:08.682551] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000f44b cdw11:00000000 00:07:53.062 [2024-04-25 20:55:08.682569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.062 [2024-04-25 20:55:08.682678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000f400 cdw11:00000000 00:07:53.062 [2024-04-25 20:55:08.682695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.062 [2024-04-25 20:55:08.682814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00007600 cdw11:00000000 00:07:53.062 [2024-04-25 20:55:08.682832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:53.062 #47 NEW cov: 11858 ft: 14395 corp: 40/202b lim: 10 exec/s: 47 rss: 70Mb L: 10/10 MS: 1 CrossOver- 00:07:53.327 [2024-04-25 20:55:08.722020] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a89 cdw11:00000000 00:07:53.327 [2024-04-25 20:55:08.722048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.327 [2024-04-25 20:55:08.722185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000fc76 cdw11:00000000 00:07:53.327 [2024-04-25 20:55:08.722203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.327 [2024-04-25 20:55:08.722315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000000a cdw11:00000000 00:07:53.327 [2024-04-25 20:55:08.722334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.327 #48 NEW cov: 11858 ft: 14400 corp: 41/209b lim: 10 exec/s: 48 rss: 70Mb L: 7/10 MS: 1 CrossOver- 00:07:53.327 [2024-04-25 20:55:08.761851] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004a0a cdw11:00000000 00:07:53.327 [2024-04-25 20:55:08.761878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
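A note for reading the qpair traces above: the paired entries are the fuzzer's host-side NVMe driver printing each admin command it sent (nvme_qpair.c: nvme_admin_qpair_print_command) and the completion the target returned (spdk_nvme_print_completion). In this run every mutated input decodes as DELETE IO SQ, admin opcode 00h, and every completion is INVALID OPCODE (00/01). That rejection is expected rather than a failure: queue creation and deletion are PCIe-transport admin commands, and an NVMe over Fabrics target such as this TCP one does not implement them. For Delete I/O SQ the NVMe base specification carries the target queue ID in CDW10 bits 15:0, which is where the fuzzed corpus bytes visible above (cdw10:0000b2a9, 00004bf4, 0000fc76, ...) end up. A minimal decode sketch in shell, since shell is what the harness uses; the cdw10 value is copied from one of the entries above, and nothing below is part of the harness itself:

  cdw10=0x0000fc76             # sample command dword 10 from the log entries above
  qid=$(( cdw10 & 0xffff ))    # Delete I/O SQ: CDW10 bits 15:0 = queue ID to delete
  printf 'DELETE IO SQ cdw10=0x%08x -> qid=%u\n' "$cdw10" "$qid"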
00:07:53.327 #49 NEW cov: 11858 ft: 14406 corp: 42/211b lim: 10 exec/s: 24 rss: 70Mb L: 2/10 MS: 1 ChangeBit-
00:07:53.327 #49 DONE cov: 11858 ft: 14406 corp: 42/211b lim: 10 exec/s: 24 rss: 70Mb
00:07:53.327 ###### Recommended dictionary. ######
00:07:53.327 "\211\262\251K\364\374v\000" # Uses: 1
00:07:53.327 "\000\000\000\000\001\000\000\000" # Uses: 0
00:07:53.327 ###### End of recommended dictionary. ######
00:07:53.327 Done 49 runs in 2 second(s)
00:07:53.327 20:55:08 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_7.conf /var/tmp/suppress_nvmf_fuzz
00:07:53.327 20:55:08 -- ../common.sh@72 -- # (( i++ ))
00:07:53.327 20:55:08 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:07:53.327 20:55:08 -- ../common.sh@73 -- # start_llvm_fuzz 8 1 0x1
00:07:53.327 20:55:08 -- nvmf/run.sh@23 -- # local fuzzer_type=8
00:07:53.327 20:55:08 -- nvmf/run.sh@24 -- # local timen=1
00:07:53.327 20:55:08 -- nvmf/run.sh@25 -- # local core=0x1
00:07:53.327 20:55:08 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8
00:07:53.327 20:55:08 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_8.conf
00:07:53.327 20:55:08 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz
00:07:53.327 20:55:08 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0
00:07:53.327 20:55:08 -- nvmf/run.sh@34 -- # printf %02d 8
00:07:53.327 20:55:08 -- nvmf/run.sh@34 -- # port=4408
00:07:53.327 20:55:08 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8
00:07:53.327 20:55:08 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408'
00:07:53.327 20:55:08 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4408"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:07:53.327 20:55:08 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect
00:07:53.327 20:55:08 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create
00:07:53.327 20:55:08 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' -c /tmp/fuzz_json_8.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 -Z 8
00:07:53.585 [2024-04-25 20:55:08.910687] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization...
00:07:53.585 [2024-04-25 20:55:08.910739] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid195805 ]
00:07:53.585 EAL: No free 2048 kB hugepages reported on node 1
00:07:53.585 [2024-04-25 20:55:09.042167] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation.
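The "Recommended dictionary" block that closes run 7 above is standard libFuzzer output: byte sequences from the fuzzer's persistent auto-dictionary that proved productive, printed with C-style octal escapes, plus a count of how often each was used. The first entry, "\211\262\251K\364\374v\000", is the byte pattern 89 b2 a9 4b f4 fc 76 00; it is recognizable both in the PersAutoDict and CMP "DE:" annotations on the MS: mutation lists above and in the logged command words (cdw10:0000b2a9, 00004bf4, 0000fc76). As a hedged sketch only, such entries could be saved in AFL/libFuzzer dictionary syntax for reuse; the file name is invented here, and this log does not show llvm_nvme_fuzz accepting a -dict= option, so the commented invocation is an assumption, not a documented flag of the harness:

  # Octal escapes from the summary rewritten as the \xNN escapes dictionary files expect.
  printf '%s\n' \
    'kw1="\x89\xb2\xa9\x4b\xf4\xfc\x76\x00"' \
    'kw2="\x00\x00\x00\x00\x01\x00\x00\x00"' \
    > /tmp/llvm_nvmf_7.dict
  # Assumed pass-through to libFuzzer; not shown anywhere in this log:
  # llvm_nvme_fuzz ... -Z 7 -dict=/tmp/llvm_nvmf_7.dict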
00:07:53.585 [2024-04-25 20:55:09.078599] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:53.585 [2024-04-25 20:55:09.097599] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:53.585 [2024-04-25 20:55:09.149613] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:53.585 [2024-04-25 20:55:09.165912] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4408 *** 00:07:53.585 INFO: Running with entropic power schedule (0xFF, 100). 00:07:53.585 INFO: Seed: 796034346 00:07:53.585 INFO: Loaded 1 modules (348579 inline 8-bit counters): 348579 [0x2764bcc, 0x27b9d6f), 00:07:53.585 INFO: Loaded 1 PC tables (348579 PCs): 348579 [0x27b9d70,0x2d0b7a0), 00:07:53.585 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:53.585 INFO: A corpus is not provided, starting from an empty corpus 00:07:53.585 [2024-04-25 20:55:09.211236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.585 [2024-04-25 20:55:09.211263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.585 #2 INITED cov: 11625 ft: 11643 corp: 1/1b exec/s: 0 rss: 66Mb 00:07:53.585 [2024-04-25 20:55:09.241503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.585 [2024-04-25 20:55:09.241528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.585 [2024-04-25 20:55:09.241587] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.585 [2024-04-25 20:55:09.241601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.585 [2024-04-25 20:55:09.241659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.585 [2024-04-25 20:55:09.241673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.100 NEW_FUNC[1/1]: 0xf20050 in spdk_ring_dequeue /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/env.c:415 00:07:54.100 #3 NEW cov: 11772 ft: 12742 corp: 2/4b lim: 5 exec/s: 0 rss: 68Mb L: 3/3 MS: 1 CMP- DE: "\035\000"- 00:07:54.100 [2024-04-25 20:55:09.573401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.100 [2024-04-25 20:55:09.573451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.100 [2024-04-25 20:55:09.573595] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.100 [2024-04-25 20:55:09.573619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.101 [2024-04-25 20:55:09.573756] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.101 [2024-04-25 20:55:09.573780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.101 #4 NEW cov: 11778 ft: 13225 corp: 3/7b lim: 5 exec/s: 0 rss: 68Mb L: 3/3 MS: 1 PersAutoDict- DE: "\035\000"- 00:07:54.101 [2024-04-25 20:55:09.623499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.101 [2024-04-25 20:55:09.623529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.101 [2024-04-25 20:55:09.623654] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.101 [2024-04-25 20:55:09.623672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.101 [2024-04-25 20:55:09.623808] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.101 [2024-04-25 20:55:09.623828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.101 #5 NEW cov: 11863 ft: 13520 corp: 4/10b lim: 5 exec/s: 0 rss: 68Mb L: 3/3 MS: 1 CopyPart- 00:07:54.101 [2024-04-25 20:55:09.683686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.101 [2024-04-25 20:55:09.683715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.101 [2024-04-25 20:55:09.683851] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.101 [2024-04-25 20:55:09.683872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.101 [2024-04-25 20:55:09.683999] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.101 [2024-04-25 20:55:09.684019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.101 #6 NEW cov: 11863 ft: 13735 corp: 5/13b lim: 5 exec/s: 0 rss: 68Mb L: 3/3 MS: 1 ChangeBit- 00:07:54.101 [2024-04-25 20:55:09.733485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.101 [2024-04-25 20:55:09.733512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.101 [2024-04-25 20:55:09.733639] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.101 [2024-04-25 
20:55:09.733657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.101 #7 NEW cov: 11863 ft: 14000 corp: 6/15b lim: 5 exec/s: 0 rss: 68Mb L: 2/3 MS: 1 EraseBytes- 00:07:54.359 [2024-04-25 20:55:09.783680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.359 [2024-04-25 20:55:09.783707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.359 [2024-04-25 20:55:09.783832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.359 [2024-04-25 20:55:09.783851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.359 #8 NEW cov: 11863 ft: 14030 corp: 7/17b lim: 5 exec/s: 0 rss: 68Mb L: 2/3 MS: 1 InsertByte- 00:07:54.359 [2024-04-25 20:55:09.833879] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.359 [2024-04-25 20:55:09.833907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.359 [2024-04-25 20:55:09.834035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.359 [2024-04-25 20:55:09.834054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.359 #9 NEW cov: 11863 ft: 14059 corp: 8/19b lim: 5 exec/s: 0 rss: 68Mb L: 2/3 MS: 1 EraseBytes- 00:07:54.359 [2024-04-25 20:55:09.894318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.359 [2024-04-25 20:55:09.894349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.359 [2024-04-25 20:55:09.894481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.359 [2024-04-25 20:55:09.894499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.359 [2024-04-25 20:55:09.894632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.359 [2024-04-25 20:55:09.894649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.359 #10 NEW cov: 11863 ft: 14092 corp: 9/22b lim: 5 exec/s: 0 rss: 68Mb L: 3/3 MS: 1 ChangeByte- 00:07:54.359 [2024-04-25 20:55:09.944208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.359 [2024-04-25 20:55:09.944236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.359 [2024-04-25 20:55:09.944360] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.359 [2024-04-25 20:55:09.944377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.359 #11 NEW cov: 11863 ft: 14126 corp: 10/24b lim: 5 exec/s: 0 rss: 69Mb L: 2/3 MS: 1 ChangeBit- 00:07:54.359 [2024-04-25 20:55:09.994348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.359 [2024-04-25 20:55:09.994375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.359 [2024-04-25 20:55:09.994504] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.359 [2024-04-25 20:55:09.994521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.617 #12 NEW cov: 11863 ft: 14155 corp: 11/26b lim: 5 exec/s: 0 rss: 69Mb L: 2/3 MS: 1 CrossOver- 00:07:54.617 [2024-04-25 20:55:10.054612] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.617 [2024-04-25 20:55:10.054648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.617 [2024-04-25 20:55:10.054778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.617 [2024-04-25 20:55:10.054795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.617 #13 NEW cov: 11863 ft: 14186 corp: 12/28b lim: 5 exec/s: 0 rss: 69Mb L: 2/3 MS: 1 CopyPart- 00:07:54.617 [2024-04-25 20:55:10.104742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.617 [2024-04-25 20:55:10.104770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.617 [2024-04-25 20:55:10.104902] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.617 [2024-04-25 20:55:10.104919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.617 NEW_FUNC[1/1]: 0x19e6b30 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:54.617 #14 NEW cov: 11886 ft: 14230 corp: 13/30b lim: 5 exec/s: 0 rss: 69Mb L: 2/3 MS: 1 CopyPart- 00:07:54.617 [2024-04-25 20:55:10.154138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.617 [2024-04-25 20:55:10.154167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.617 #15 NEW cov: 11886 ft: 14303 corp: 14/31b lim: 5 exec/s: 0 rss: 69Mb L: 1/3 MS: 1 EraseBytes- 00:07:54.617 [2024-04-25 20:55:10.194563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.617 [2024-04-25 20:55:10.194592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.617 [2024-04-25 20:55:10.194720] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.617 [2024-04-25 20:55:10.194735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.617 #16 NEW cov: 11886 ft: 14334 corp: 15/33b lim: 5 exec/s: 16 rss: 69Mb L: 2/3 MS: 1 ChangeByte- 00:07:54.617 [2024-04-25 20:55:10.234695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.617 [2024-04-25 20:55:10.234722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.617 [2024-04-25 20:55:10.234843] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.617 [2024-04-25 20:55:10.234860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.617 #17 NEW cov: 11886 ft: 14361 corp: 16/35b lim: 5 exec/s: 17 rss: 69Mb L: 2/3 MS: 1 ChangeBit- 00:07:54.876 [2024-04-25 20:55:10.285239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.876 [2024-04-25 20:55:10.285266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.876 [2024-04-25 20:55:10.285387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.876 [2024-04-25 20:55:10.285406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.876 #18 NEW cov: 11886 ft: 14393 corp: 17/37b lim: 5 exec/s: 18 rss: 69Mb L: 2/3 MS: 1 ShuffleBytes- 00:07:54.876 [2024-04-25 20:55:10.335426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.876 [2024-04-25 20:55:10.335451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.876 [2024-04-25 20:55:10.335592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.876 [2024-04-25 20:55:10.335608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.876 #19 NEW cov: 11886 ft: 14436 corp: 
18/39b lim: 5 exec/s: 19 rss: 69Mb L: 2/3 MS: 1 ChangeByte- 00:07:54.876 [2024-04-25 20:55:10.375824] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.876 [2024-04-25 20:55:10.375854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.876 [2024-04-25 20:55:10.375996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.876 [2024-04-25 20:55:10.376015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.876 [2024-04-25 20:55:10.376142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.876 [2024-04-25 20:55:10.376160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.876 #20 NEW cov: 11886 ft: 14442 corp: 19/42b lim: 5 exec/s: 20 rss: 69Mb L: 3/3 MS: 1 ChangeBit- 00:07:54.876 [2024-04-25 20:55:10.425929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.876 [2024-04-25 20:55:10.425959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.876 [2024-04-25 20:55:10.426095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.876 [2024-04-25 20:55:10.426115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.876 [2024-04-25 20:55:10.426238] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.876 [2024-04-25 20:55:10.426259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.876 #21 NEW cov: 11886 ft: 14446 corp: 20/45b lim: 5 exec/s: 21 rss: 69Mb L: 3/3 MS: 1 ChangeBinInt- 00:07:54.876 [2024-04-25 20:55:10.466298] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.876 [2024-04-25 20:55:10.466326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.876 [2024-04-25 20:55:10.466452] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.876 [2024-04-25 20:55:10.466471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.876 [2024-04-25 20:55:10.466594] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.876 [2024-04-25 
20:55:10.466612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.876 [2024-04-25 20:55:10.466745] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.876 [2024-04-25 20:55:10.466763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.876 #22 NEW cov: 11886 ft: 14719 corp: 21/49b lim: 5 exec/s: 22 rss: 69Mb L: 4/4 MS: 1 CrossOver- 00:07:54.876 [2024-04-25 20:55:10.505499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.876 [2024-04-25 20:55:10.505525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.876 #23 NEW cov: 11886 ft: 14771 corp: 22/50b lim: 5 exec/s: 23 rss: 69Mb L: 1/4 MS: 1 EraseBytes- 00:07:55.134 [2024-04-25 20:55:10.556605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.134 [2024-04-25 20:55:10.556632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.134 [2024-04-25 20:55:10.556757] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.134 [2024-04-25 20:55:10.556775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.134 [2024-04-25 20:55:10.556899] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.134 [2024-04-25 20:55:10.556917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.134 [2024-04-25 20:55:10.557048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.134 [2024-04-25 20:55:10.557066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.134 #24 NEW cov: 11886 ft: 14792 corp: 23/54b lim: 5 exec/s: 24 rss: 69Mb L: 4/4 MS: 1 CrossOver- 00:07:55.134 [2024-04-25 20:55:10.616525] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.134 [2024-04-25 20:55:10.616554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.134 [2024-04-25 20:55:10.616690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.134 [2024-04-25 20:55:10.616707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.135 [2024-04-25 20:55:10.616832] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.135 [2024-04-25 20:55:10.616849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.135 #25 NEW cov: 11886 ft: 14802 corp: 24/57b lim: 5 exec/s: 25 rss: 70Mb L: 3/4 MS: 1 ChangeBinInt- 00:07:55.135 [2024-04-25 20:55:10.667243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.135 [2024-04-25 20:55:10.667270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.135 [2024-04-25 20:55:10.667399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.135 [2024-04-25 20:55:10.667416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.135 [2024-04-25 20:55:10.667554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.135 [2024-04-25 20:55:10.667572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.135 [2024-04-25 20:55:10.667705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.135 [2024-04-25 20:55:10.667724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.135 [2024-04-25 20:55:10.667856] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.135 [2024-04-25 20:55:10.667874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.135 #26 NEW cov: 11886 ft: 14858 corp: 25/62b lim: 5 exec/s: 26 rss: 70Mb L: 5/5 MS: 1 InsertByte- 00:07:55.135 [2024-04-25 20:55:10.727078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.135 [2024-04-25 20:55:10.727111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.135 [2024-04-25 20:55:10.727252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.135 [2024-04-25 20:55:10.727272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.135 [2024-04-25 20:55:10.727405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.135 [2024-04-25 20:55:10.727423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 
m:0 dnr:0 00:07:55.135 [2024-04-25 20:55:10.727560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.135 [2024-04-25 20:55:10.727577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.135 #27 NEW cov: 11886 ft: 14873 corp: 26/66b lim: 5 exec/s: 27 rss: 70Mb L: 4/5 MS: 1 EraseBytes- 00:07:55.135 [2024-04-25 20:55:10.787662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.135 [2024-04-25 20:55:10.787692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.135 [2024-04-25 20:55:10.787807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.135 [2024-04-25 20:55:10.787824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.135 [2024-04-25 20:55:10.787947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.135 [2024-04-25 20:55:10.787966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.135 [2024-04-25 20:55:10.788088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.135 [2024-04-25 20:55:10.788107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.135 [2024-04-25 20:55:10.788232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.135 [2024-04-25 20:55:10.788252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.393 #28 NEW cov: 11886 ft: 14888 corp: 27/71b lim: 5 exec/s: 28 rss: 70Mb L: 5/5 MS: 1 CrossOver- 00:07:55.393 [2024-04-25 20:55:10.847245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.393 [2024-04-25 20:55:10.847274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.393 [2024-04-25 20:55:10.847407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.394 [2024-04-25 20:55:10.847425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.394 [2024-04-25 20:55:10.847550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.394 [2024-04-25 20:55:10.847569] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.394 #29 NEW cov: 11886 ft: 14900 corp: 28/74b lim: 5 exec/s: 29 rss: 70Mb L: 3/5 MS: 1 EraseBytes- 00:07:55.394 [2024-04-25 20:55:10.886800] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.394 [2024-04-25 20:55:10.886827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.394 [2024-04-25 20:55:10.886945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.394 [2024-04-25 20:55:10.886964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.394 #30 NEW cov: 11886 ft: 14920 corp: 29/76b lim: 5 exec/s: 30 rss: 70Mb L: 2/5 MS: 1 ChangeBit- 00:07:55.394 [2024-04-25 20:55:10.927275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.394 [2024-04-25 20:55:10.927301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.394 [2024-04-25 20:55:10.927419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.394 [2024-04-25 20:55:10.927437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.394 [2024-04-25 20:55:10.927561] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.394 [2024-04-25 20:55:10.927593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.394 [2024-04-25 20:55:10.927725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.394 [2024-04-25 20:55:10.927744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.394 [2024-04-25 20:55:10.927867] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.394 [2024-04-25 20:55:10.927885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.394 #31 NEW cov: 11886 ft: 14927 corp: 30/81b lim: 5 exec/s: 31 rss: 70Mb L: 5/5 MS: 1 PersAutoDict- DE: "\035\000"- 00:07:55.394 [2024-04-25 20:55:10.967284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.394 [2024-04-25 20:55:10.967310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.394 [2024-04-25 20:55:10.967443] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE 
ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.394 [2024-04-25 20:55:10.967463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.394 [2024-04-25 20:55:10.967590] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.394 [2024-04-25 20:55:10.967608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.394 #32 NEW cov: 11886 ft: 14946 corp: 31/84b lim: 5 exec/s: 32 rss: 70Mb L: 3/5 MS: 1 ChangeBinInt- 00:07:55.394 [2024-04-25 20:55:11.017132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.394 [2024-04-25 20:55:11.017166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.394 #33 NEW cov: 11886 ft: 14947 corp: 32/85b lim: 5 exec/s: 33 rss: 70Mb L: 1/5 MS: 1 EraseBytes- 00:07:55.653 [2024-04-25 20:55:11.057812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.653 [2024-04-25 20:55:11.057840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.653 [2024-04-25 20:55:11.057972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.653 [2024-04-25 20:55:11.057997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.653 [2024-04-25 20:55:11.058126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.653 [2024-04-25 20:55:11.058142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.653 #34 NEW cov: 11886 ft: 14955 corp: 33/88b lim: 5 exec/s: 34 rss: 70Mb L: 3/5 MS: 1 InsertByte- 00:07:55.653 [2024-04-25 20:55:11.107928] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.653 [2024-04-25 20:55:11.107957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.653 [2024-04-25 20:55:11.108091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.653 [2024-04-25 20:55:11.108112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.653 [2024-04-25 20:55:11.108238] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.653 [2024-04-25 20:55:11.108256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.653 #35 NEW cov: 11886 ft: 14957 corp: 34/91b lim: 5 exec/s: 35 rss: 70Mb L: 3/5 MS: 1 ChangeBinInt- 00:07:55.653 [2024-04-25 20:55:11.157473] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.653 [2024-04-25 20:55:11.157501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.653 [2024-04-25 20:55:11.157628] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.653 [2024-04-25 20:55:11.157646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.653 #36 NEW cov: 11886 ft: 14962 corp: 35/93b lim: 5 exec/s: 36 rss: 70Mb L: 2/5 MS: 1 CrossOver- 00:07:55.653 [2024-04-25 20:55:11.217929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.653 [2024-04-25 20:55:11.217958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.653 [2024-04-25 20:55:11.218084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.653 [2024-04-25 20:55:11.218101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.653 #37 NEW cov: 11886 ft: 14972 corp: 36/95b lim: 5 exec/s: 18 rss: 70Mb L: 2/5 MS: 1 EraseBytes- 00:07:55.653 #37 DONE cov: 11886 ft: 14972 corp: 36/95b lim: 5 exec/s: 18 rss: 70Mb 00:07:55.653 ###### Recommended dictionary. ###### 00:07:55.653 "\035\000" # Uses: 2 00:07:55.653 ###### End of recommended dictionary. 
######
00:07:55.653 Done 37 runs in 2 second(s)
00:07:55.913 20:55:11 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_8.conf /var/tmp/suppress_nvmf_fuzz
00:07:55.913 20:55:11 -- ../common.sh@72 -- # (( i++ ))
00:07:55.913 20:55:11 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:07:55.913 20:55:11 -- ../common.sh@73 -- # start_llvm_fuzz 9 1 0x1
00:07:55.913 20:55:11 -- nvmf/run.sh@23 -- # local fuzzer_type=9
00:07:55.913 20:55:11 -- nvmf/run.sh@24 -- # local timen=1
00:07:55.913 20:55:11 -- nvmf/run.sh@25 -- # local core=0x1
00:07:55.913 20:55:11 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9
00:07:55.913 20:55:11 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_9.conf
00:07:55.913 20:55:11 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz
00:07:55.913 20:55:11 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0
00:07:55.913 20:55:11 -- nvmf/run.sh@34 -- # printf %02d 9
00:07:55.913 20:55:11 -- nvmf/run.sh@34 -- # port=4409
00:07:55.913 20:55:11 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9
00:07:55.913 20:55:11 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409'
00:07:55.913 20:55:11 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4409"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:07:55.913 20:55:11 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect
00:07:55.913 20:55:11 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create
00:07:55.913 20:55:11 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' -c /tmp/fuzz_json_9.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 -Z 9
00:07:55.913 [2024-04-25 20:55:11.394151] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization...
00:07:55.913 [2024-04-25 20:55:11.394221] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid196203 ]
00:07:55.913 EAL: No free 2048 kB hugepages reported on node 1
00:07:55.913 [2024-04-25 20:55:11.535943] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation.
00:07:55.913 [2024-04-25 20:55:11.574352] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:56.173 [2024-04-25 20:55:11.595632] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:56.173 [2024-04-25 20:55:11.647962] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:56.173 [2024-04-25 20:55:11.664261] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4409 ***
00:07:56.173 INFO: Running with entropic power schedule (0xFF, 100).
00:07:56.173 INFO: Seed: 3294044246 00:07:56.173 INFO: Loaded 1 modules (348579 inline 8-bit counters): 348579 [0x2764bcc, 0x27b9d6f), 00:07:56.173 INFO: Loaded 1 PC tables (348579 PCs): 348579 [0x27b9d70,0x2d0b7a0), 00:07:56.173 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:56.173 INFO: A corpus is not provided, starting from an empty corpus 00:07:56.173 [2024-04-25 20:55:11.709469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.173 [2024-04-25 20:55:11.709497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.173 #2 INITED cov: 11603 ft: 11637 corp: 1/1b exec/s: 0 rss: 67Mb 00:07:56.173 [2024-04-25 20:55:11.739588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.173 [2024-04-25 20:55:11.739612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.173 [2024-04-25 20:55:11.739666] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.173 [2024-04-25 20:55:11.739680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.432 NEW_FUNC[1/4]: 0x1d01fd0 in spdk_thread_is_exited /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:726 00:07:56.432 NEW_FUNC[2/4]: 0x1d02bf0 in spdk_thread_get_from_ctx /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:795 00:07:56.432 #3 NEW cov: 11772 ft: 12884 corp: 2/3b lim: 5 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 CrossOver- 00:07:56.432 [2024-04-25 20:55:12.060332] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.432 [2024-04-25 20:55:12.060363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.432 #4 NEW cov: 11778 ft: 13061 corp: 3/4b lim: 5 exec/s: 0 rss: 68Mb L: 1/2 MS: 1 EraseBytes- 00:07:56.692 [2024-04-25 20:55:12.100496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.692 [2024-04-25 20:55:12.100519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.692 [2024-04-25 20:55:12.100575] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.692 [2024-04-25 20:55:12.100588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.692 #5 NEW cov: 11863 ft: 13293 corp: 4/6b lim: 5 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 ChangeBit- 00:07:56.692 [2024-04-25 20:55:12.140640] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:07:56.692 [2024-04-25 20:55:12.140664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.692 [2024-04-25 20:55:12.140716] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.692 [2024-04-25 20:55:12.140729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.692 #6 NEW cov: 11863 ft: 13386 corp: 5/8b lim: 5 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 ChangeByte- 00:07:56.692 [2024-04-25 20:55:12.181025] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.692 [2024-04-25 20:55:12.181049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.692 [2024-04-25 20:55:12.181105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.692 [2024-04-25 20:55:12.181119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.692 [2024-04-25 20:55:12.181172] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.692 [2024-04-25 20:55:12.181185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.692 [2024-04-25 20:55:12.181235] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.692 [2024-04-25 20:55:12.181248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.692 #7 NEW cov: 11863 ft: 13768 corp: 6/12b lim: 5 exec/s: 0 rss: 68Mb L: 4/4 MS: 1 CopyPart- 00:07:56.692 [2024-04-25 20:55:12.220704] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.692 [2024-04-25 20:55:12.220728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.692 #8 NEW cov: 11863 ft: 13819 corp: 7/13b lim: 5 exec/s: 0 rss: 68Mb L: 1/4 MS: 1 ShuffleBytes- 00:07:56.692 [2024-04-25 20:55:12.261629] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.692 [2024-04-25 20:55:12.261652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.692 [2024-04-25 20:55:12.261723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.692 [2024-04-25 20:55:12.261736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.692 [2024-04-25 20:55:12.261788] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.692 [2024-04-25 20:55:12.261800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.692 [2024-04-25 20:55:12.261849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.692 [2024-04-25 20:55:12.261863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.692 [2024-04-25 20:55:12.261917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.692 [2024-04-25 20:55:12.261931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.692 #9 NEW cov: 11863 ft: 13954 corp: 8/18b lim: 5 exec/s: 0 rss: 69Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:56.692 [2024-04-25 20:55:12.301521] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.692 [2024-04-25 20:55:12.301544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.692 [2024-04-25 20:55:12.301595] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.692 [2024-04-25 20:55:12.301613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.692 [2024-04-25 20:55:12.301667] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.692 [2024-04-25 20:55:12.301681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.692 [2024-04-25 20:55:12.301732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.692 [2024-04-25 20:55:12.301744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.692 [2024-04-25 20:55:12.301797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.692 [2024-04-25 20:55:12.301810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.692 #10 NEW cov: 11863 ft: 13978 corp: 9/23b lim: 5 exec/s: 0 rss: 69Mb L: 5/5 MS: 1 ChangeByte- 00:07:56.692 [2024-04-25 20:55:12.341484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.692 [2024-04-25 20:55:12.341508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.692 [2024-04-25 20:55:12.341560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.692 [2024-04-25 20:55:12.341575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.692 [2024-04-25 20:55:12.341627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.692 [2024-04-25 20:55:12.341640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.692 [2024-04-25 20:55:12.341693] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.692 [2024-04-25 20:55:12.341707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.951 #11 NEW cov: 11863 ft: 14041 corp: 10/27b lim: 5 exec/s: 0 rss: 69Mb L: 4/5 MS: 1 ChangeByte- 00:07:56.951 [2024-04-25 20:55:12.391770] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.951 [2024-04-25 20:55:12.391794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.951 [2024-04-25 20:55:12.391863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.952 [2024-04-25 20:55:12.391876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.952 [2024-04-25 20:55:12.391928] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.952 [2024-04-25 20:55:12.391941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.952 [2024-04-25 20:55:12.391996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.952 [2024-04-25 20:55:12.392009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.952 [2024-04-25 20:55:12.392065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.952 [2024-04-25 20:55:12.392079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.952 #12 NEW cov: 11863 ft: 14104 corp: 11/32b lim: 5 exec/s: 0 rss: 69Mb L: 5/5 MS: 1 CopyPart- 00:07:56.952 [2024-04-25 20:55:12.441445] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.952 [2024-04-25 20:55:12.441469] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.952 [2024-04-25 20:55:12.441522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.952 [2024-04-25 20:55:12.441535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.952 #13 NEW cov: 11863 ft: 14204 corp: 12/34b lim: 5 exec/s: 0 rss: 69Mb L: 2/5 MS: 1 ChangeByte- 00:07:56.952 [2024-04-25 20:55:12.481542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.952 [2024-04-25 20:55:12.481566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.952 [2024-04-25 20:55:12.481620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.952 [2024-04-25 20:55:12.481636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.952 #14 NEW cov: 11863 ft: 14226 corp: 13/36b lim: 5 exec/s: 0 rss: 69Mb L: 2/5 MS: 1 ChangeByte- 00:07:56.952 [2024-04-25 20:55:12.521542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.952 [2024-04-25 20:55:12.521566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.952 #15 NEW cov: 11863 ft: 14232 corp: 14/37b lim: 5 exec/s: 0 rss: 69Mb L: 1/5 MS: 1 EraseBytes- 00:07:56.952 [2024-04-25 20:55:12.561816] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.952 [2024-04-25 20:55:12.561840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.952 [2024-04-25 20:55:12.561895] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.952 [2024-04-25 20:55:12.561908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.952 #16 NEW cov: 11863 ft: 14248 corp: 15/39b lim: 5 exec/s: 0 rss: 69Mb L: 2/5 MS: 1 EraseBytes- 00:07:56.952 [2024-04-25 20:55:12.601737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.952 [2024-04-25 20:55:12.601760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.212 NEW_FUNC[1/1]: 0x19e6b30 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:57.212 #17 NEW cov: 11886 ft: 14274 corp: 16/40b lim: 5 exec/s: 0 rss: 69Mb L: 1/5 MS: 1 ChangeBit- 00:07:57.212 [2024-04-25 20:55:12.642446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 
cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.212 [2024-04-25 20:55:12.642474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.212 [2024-04-25 20:55:12.642544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.212 [2024-04-25 20:55:12.642558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.212 [2024-04-25 20:55:12.642608] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.212 [2024-04-25 20:55:12.642622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.212 [2024-04-25 20:55:12.642674] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.212 [2024-04-25 20:55:12.642687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.212 [2024-04-25 20:55:12.642740] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.212 [2024-04-25 20:55:12.642754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:57.212 #18 NEW cov: 11886 ft: 14345 corp: 17/45b lim: 5 exec/s: 0 rss: 69Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:57.212 [2024-04-25 20:55:12.682341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.212 [2024-04-25 20:55:12.682365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.212 [2024-04-25 20:55:12.682419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.212 [2024-04-25 20:55:12.682435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.212 [2024-04-25 20:55:12.682490] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.212 [2024-04-25 20:55:12.682503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.212 #19 NEW cov: 11886 ft: 14505 corp: 18/48b lim: 5 exec/s: 19 rss: 70Mb L: 3/5 MS: 1 InsertByte- 00:07:57.212 [2024-04-25 20:55:12.722068] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.212 [2024-04-25 20:55:12.722092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.212 #20 NEW cov: 11886 ft: 14534 corp: 19/49b lim: 5 
exec/s: 20 rss: 70Mb L: 1/5 MS: 1 ChangeBit- 00:07:57.212 [2024-04-25 20:55:12.762794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.212 [2024-04-25 20:55:12.762818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.212 [2024-04-25 20:55:12.762871] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.212 [2024-04-25 20:55:12.762885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.212 [2024-04-25 20:55:12.762936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.212 [2024-04-25 20:55:12.762952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.212 [2024-04-25 20:55:12.763006] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.212 [2024-04-25 20:55:12.763018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.212 [2024-04-25 20:55:12.763088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.212 [2024-04-25 20:55:12.763102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:57.212 #21 NEW cov: 11886 ft: 14589 corp: 20/54b lim: 5 exec/s: 21 rss: 70Mb L: 5/5 MS: 1 CMP- DE: "\000\000\002\000"- 00:07:57.212 [2024-04-25 20:55:12.802991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.212 [2024-04-25 20:55:12.803019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.212 [2024-04-25 20:55:12.803087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.212 [2024-04-25 20:55:12.803101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.212 [2024-04-25 20:55:12.803152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.212 [2024-04-25 20:55:12.803165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.212 [2024-04-25 20:55:12.803216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.212 [2024-04-25 20:55:12.803229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 
sqhd:0012 p:0 m:0 dnr:0 00:07:57.212 [2024-04-25 20:55:12.803291] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.212 [2024-04-25 20:55:12.803304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:57.212 #22 NEW cov: 11886 ft: 14626 corp: 21/59b lim: 5 exec/s: 22 rss: 70Mb L: 5/5 MS: 1 ChangeByte- 00:07:57.212 [2024-04-25 20:55:12.842747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.212 [2024-04-25 20:55:12.842770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.212 [2024-04-25 20:55:12.842824] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.212 [2024-04-25 20:55:12.842838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.212 [2024-04-25 20:55:12.842891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.212 [2024-04-25 20:55:12.842904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.212 #23 NEW cov: 11886 ft: 14640 corp: 22/62b lim: 5 exec/s: 23 rss: 70Mb L: 3/5 MS: 1 CrossOver- 00:07:57.471 [2024-04-25 20:55:12.883164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.471 [2024-04-25 20:55:12.883188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.471 [2024-04-25 20:55:12.883257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.471 [2024-04-25 20:55:12.883270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.472 [2024-04-25 20:55:12.883323] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.472 [2024-04-25 20:55:12.883337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.472 [2024-04-25 20:55:12.883390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.472 [2024-04-25 20:55:12.883402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.472 [2024-04-25 20:55:12.883453] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.472 [2024-04-25 20:55:12.883466] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:57.472 #24 NEW cov: 11886 ft: 14662 corp: 23/67b lim: 5 exec/s: 24 rss: 70Mb L: 5/5 MS: 1 CopyPart- 00:07:57.472 [2024-04-25 20:55:12.933160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.472 [2024-04-25 20:55:12.933184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.472 [2024-04-25 20:55:12.933236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.472 [2024-04-25 20:55:12.933253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.472 [2024-04-25 20:55:12.933303] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.472 [2024-04-25 20:55:12.933316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.472 [2024-04-25 20:55:12.933367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.472 [2024-04-25 20:55:12.933380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.472 #25 NEW cov: 11886 ft: 14669 corp: 24/71b lim: 5 exec/s: 25 rss: 70Mb L: 4/5 MS: 1 CopyPart- 00:07:57.472 [2024-04-25 20:55:12.972952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.472 [2024-04-25 20:55:12.972975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.472 [2024-04-25 20:55:12.973045] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.472 [2024-04-25 20:55:12.973059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.472 #26 NEW cov: 11886 ft: 14699 corp: 25/73b lim: 5 exec/s: 26 rss: 70Mb L: 2/5 MS: 1 CrossOver- 00:07:57.472 [2024-04-25 20:55:13.013515] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.472 [2024-04-25 20:55:13.013540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.472 [2024-04-25 20:55:13.013610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.472 [2024-04-25 20:55:13.013624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.472 [2024-04-25 20:55:13.013675] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.472 [2024-04-25 20:55:13.013688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.472 [2024-04-25 20:55:13.013740] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.472 [2024-04-25 20:55:13.013753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.472 [2024-04-25 20:55:13.013803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.472 [2024-04-25 20:55:13.013817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:57.472 #27 NEW cov: 11886 ft: 14732 corp: 26/78b lim: 5 exec/s: 27 rss: 70Mb L: 5/5 MS: 1 ShuffleBytes- 00:07:57.472 [2024-04-25 20:55:13.053642] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.472 [2024-04-25 20:55:13.053667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.472 [2024-04-25 20:55:13.053718] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.472 [2024-04-25 20:55:13.053733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.472 [2024-04-25 20:55:13.053785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.472 [2024-04-25 20:55:13.053798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.472 [2024-04-25 20:55:13.053849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.472 [2024-04-25 20:55:13.053862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.472 [2024-04-25 20:55:13.053913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.472 [2024-04-25 20:55:13.053925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:57.472 #28 NEW cov: 11886 ft: 14783 corp: 27/83b lim: 5 exec/s: 28 rss: 70Mb L: 5/5 MS: 1 CMP- DE: "\001\000\000\000"- 00:07:57.472 [2024-04-25 20:55:13.103772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.472 [2024-04-25 20:55:13.103797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.472 
[2024-04-25 20:55:13.103852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.472 [2024-04-25 20:55:13.103865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.472 [2024-04-25 20:55:13.103917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.472 [2024-04-25 20:55:13.103929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.472 [2024-04-25 20:55:13.103982] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.472 [2024-04-25 20:55:13.103998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.472 [2024-04-25 20:55:13.104067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.472 [2024-04-25 20:55:13.104081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:57.472 #29 NEW cov: 11886 ft: 14796 corp: 28/88b lim: 5 exec/s: 29 rss: 70Mb L: 5/5 MS: 1 ShuffleBytes- 00:07:57.732 [2024-04-25 20:55:13.143894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.732 [2024-04-25 20:55:13.143917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.732 [2024-04-25 20:55:13.143985] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.732 [2024-04-25 20:55:13.144003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.732 [2024-04-25 20:55:13.144054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.732 [2024-04-25 20:55:13.144067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.732 [2024-04-25 20:55:13.144118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.732 [2024-04-25 20:55:13.144131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.732 [2024-04-25 20:55:13.144183] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.732 [2024-04-25 20:55:13.144197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:57.732 #30 NEW cov: 11886 ft: 14822 corp: 29/93b 
lim: 5 exec/s: 30 rss: 70Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:57.732 [2024-04-25 20:55:13.183827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.732 [2024-04-25 20:55:13.183851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.732 [2024-04-25 20:55:13.183921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.732 [2024-04-25 20:55:13.183935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.732 [2024-04-25 20:55:13.183999] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.732 [2024-04-25 20:55:13.184013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.732 [2024-04-25 20:55:13.184066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.732 [2024-04-25 20:55:13.184078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.732 #31 NEW cov: 11886 ft: 14834 corp: 30/97b lim: 5 exec/s: 31 rss: 70Mb L: 4/5 MS: 1 InsertRepeatedBytes- 00:07:57.732 [2024-04-25 20:55:13.223714] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.732 [2024-04-25 20:55:13.223739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.732 [2024-04-25 20:55:13.223792] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.732 [2024-04-25 20:55:13.223808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.732 #32 NEW cov: 11886 ft: 14852 corp: 31/99b lim: 5 exec/s: 32 rss: 70Mb L: 2/5 MS: 1 ChangeByte- 00:07:57.732 [2024-04-25 20:55:13.264279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.732 [2024-04-25 20:55:13.264302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.732 [2024-04-25 20:55:13.264370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.732 [2024-04-25 20:55:13.264384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.732 [2024-04-25 20:55:13.264438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.732 [2024-04-25 
20:55:13.264450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.732 [2024-04-25 20:55:13.264502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.732 [2024-04-25 20:55:13.264514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.732 [2024-04-25 20:55:13.264568] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.732 [2024-04-25 20:55:13.264582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:57.732 #33 NEW cov: 11886 ft: 14866 corp: 32/104b lim: 5 exec/s: 33 rss: 70Mb L: 5/5 MS: 1 CopyPart- 00:07:57.732 [2024-04-25 20:55:13.304228] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.732 [2024-04-25 20:55:13.304251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.732 [2024-04-25 20:55:13.304304] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.732 [2024-04-25 20:55:13.304320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.732 [2024-04-25 20:55:13.304372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.732 [2024-04-25 20:55:13.304384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.732 [2024-04-25 20:55:13.304435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.732 [2024-04-25 20:55:13.304448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.732 #34 NEW cov: 11886 ft: 14875 corp: 33/108b lim: 5 exec/s: 34 rss: 70Mb L: 4/5 MS: 1 CMP- DE: "\000\000\000\001"- 00:07:57.732 [2024-04-25 20:55:13.344474] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.732 [2024-04-25 20:55:13.344498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.732 [2024-04-25 20:55:13.344550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.732 [2024-04-25 20:55:13.344569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.732 [2024-04-25 20:55:13.344621] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 
cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:57.732 [2024-04-25 20:55:13.344634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:57.732 [2024-04-25 20:55:13.344684] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:57.732 [2024-04-25 20:55:13.344696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:57.732 [2024-04-25 20:55:13.344747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:57.732 [2024-04-25 20:55:13.344760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0
00:07:57.732 #35 NEW cov: 11886 ft: 14900 corp: 34/113b lim: 5 exec/s: 35 rss: 70Mb L: 5/5 MS: 1 CopyPart-
00:07:57.732 [2024-04-25 20:55:13.384019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:57.732 [2024-04-25 20:55:13.384043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:57.992 #36 NEW cov: 11886 ft: 14934 corp: 35/114b lim: 5 exec/s: 36 rss: 70Mb L: 1/5 MS: 1 ChangeBit-
00:07:57.992 [2024-04-25 20:55:13.424715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:57.992 [2024-04-25 20:55:13.424739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:57.992 [2024-04-25 20:55:13.424793] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:57.992 [2024-04-25 20:55:13.424812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:57.992 [2024-04-25 20:55:13.424882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:57.992 [2024-04-25 20:55:13.424897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:57.992 [2024-04-25 20:55:13.424948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:57.992 [2024-04-25 20:55:13.424961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:57.992 [2024-04-25 20:55:13.425015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:57.992 [2024-04-25 20:55:13.425029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0
00:07:57.992 #37 NEW cov: 11886 ft: 14940 corp: 36/119b lim: 5 exec/s: 37 rss: 70Mb L: 5/5 MS: 1 ChangeByte-
00:07:57.992 [2024-04-25 20:55:13.464514] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:57.992 [2024-04-25 20:55:13.464537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:57.992 [2024-04-25 20:55:13.464591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:57.992 [2024-04-25 20:55:13.464603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:57.992 [2024-04-25 20:55:13.464654] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:57.992 [2024-04-25 20:55:13.464667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:57.992 #38 NEW cov: 11886 ft: 14966 corp: 37/122b lim: 5 exec/s: 38 rss: 70Mb L: 3/5 MS: 1 CrossOver-
00:07:57.992 [2024-04-25 20:55:13.504791] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:57.992 [2024-04-25 20:55:13.504815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:57.992 [2024-04-25 20:55:13.504867] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:57.992 [2024-04-25 20:55:13.504881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:57.992 [2024-04-25 20:55:13.504934] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:57.992 [2024-04-25 20:55:13.504946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:57.992 [2024-04-25 20:55:13.505001] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:57.992 [2024-04-25 20:55:13.505014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:57.992 #39 NEW cov: 11886 ft: 14974 corp: 38/126b lim: 5 exec/s: 39 rss: 70Mb L: 4/5 MS: 1 EraseBytes-
00:07:57.992 [2024-04-25 20:55:13.544784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:57.992 [2024-04-25 20:55:13.544807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:57.992 [2024-04-25 20:55:13.544878] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:57.992 [2024-04-25 20:55:13.544894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:57.992 [2024-04-25 20:55:13.544948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:57.992 [2024-04-25 20:55:13.544961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:57.992 #40 NEW cov: 11886 ft: 14981 corp: 39/129b lim: 5 exec/s: 40 rss: 70Mb L: 3/5 MS: 1 CrossOver-
00:07:57.992 [2024-04-25 20:55:13.585226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:57.992 [2024-04-25 20:55:13.585249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:57.992 [2024-04-25 20:55:13.585300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:57.993 [2024-04-25 20:55:13.585321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:57.993 [2024-04-25 20:55:13.585371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:57.993 [2024-04-25 20:55:13.585384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:57.993 [2024-04-25 20:55:13.585434] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:57.993 [2024-04-25 20:55:13.585447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:57.993 [2024-04-25 20:55:13.585500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:57.993 [2024-04-25 20:55:13.585513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0
00:07:57.993 #41 NEW cov: 11886 ft: 14992 corp: 40/134b lim: 5 exec/s: 41 rss: 70Mb L: 5/5 MS: 1 CopyPart-
00:07:57.993 [2024-04-25 20:55:13.624990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:57.993 [2024-04-25 20:55:13.625018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:57.993 [2024-04-25 20:55:13.625098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:57.993 [2024-04-25 20:55:13.625111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:57.993 [2024-04-25 20:55:13.625162] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:57.993 [2024-04-25 20:55:13.625175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:57.993 #42 NEW cov: 11886 ft: 14995 corp: 41/137b lim: 5 exec/s: 42 rss: 70Mb L: 3/5 MS: 1 EraseBytes-
00:07:58.253 [2024-04-25 20:55:13.665404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:58.253 [2024-04-25 20:55:13.665428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:58.253 [2024-04-25 20:55:13.665483] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:58.253 [2024-04-25 20:55:13.665496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:58.253 [2024-04-25 20:55:13.665549] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:58.253 [2024-04-25 20:55:13.665561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:58.253 [2024-04-25 20:55:13.665613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:58.253 [2024-04-25 20:55:13.665627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:58.253 [2024-04-25 20:55:13.665680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:58.253 [2024-04-25 20:55:13.665693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0
00:07:58.253 #43 NEW cov: 11886 ft: 15001 corp: 42/142b lim: 5 exec/s: 43 rss: 70Mb L: 5/5 MS: 1 CopyPart-
00:07:58.253 [2024-04-25 20:55:13.705112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:58.253 [2024-04-25 20:55:13.705136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:58.253 [2024-04-25 20:55:13.705192] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:58.253 [2024-04-25 20:55:13.705205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:58.253 #44 NEW cov: 11886 ft: 15005 corp: 43/144b lim: 5 exec/s: 22 rss: 70Mb L: 2/5 MS: 1 ChangeByte-
00:07:58.253 #44 DONE cov: 11886 ft: 15005 corp: 43/144b lim: 5 exec/s: 22 rss: 70Mb
00:07:58.253 ###### Recommended dictionary. ######
00:07:58.253 "\000\000\002\000" # Uses: 0
00:07:58.253 "\001\000\000\000" # Uses: 0
00:07:58.253 "\000\000\000\001" # Uses: 0
00:07:58.253 ###### End of recommended dictionary. ######
00:07:58.253 Done 44 runs in 2 second(s)
00:07:58.253 20:55:13 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_9.conf /var/tmp/suppress_nvmf_fuzz
00:07:58.253 20:55:13 -- ../common.sh@72 -- # (( i++ ))
00:07:58.253 20:55:13 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:07:58.253 20:55:13 -- ../common.sh@73 -- # start_llvm_fuzz 10 1 0x1
00:07:58.253 20:55:13 -- nvmf/run.sh@23 -- # local fuzzer_type=10
00:07:58.253 20:55:13 -- nvmf/run.sh@24 -- # local timen=1
00:07:58.253 20:55:13 -- nvmf/run.sh@25 -- # local core=0x1
00:07:58.253 20:55:13 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10
00:07:58.253 20:55:13 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_10.conf
00:07:58.253 20:55:13 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz
00:07:58.253 20:55:13 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0
00:07:58.253 20:55:13 -- nvmf/run.sh@34 -- # printf %02d 10
00:07:58.253 20:55:13 -- nvmf/run.sh@34 -- # port=4410
00:07:58.253 20:55:13 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10
00:07:58.253 20:55:13 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410'
00:07:58.253 20:55:13 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4410"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:07:58.253 20:55:13 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect
00:07:58.253 20:55:13 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create
00:07:58.253 20:55:13 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' -c /tmp/fuzz_json_10.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 -Z 10
00:07:58.513 [2024-04-25 20:55:13.854822] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization...
00:07:58.513 [2024-04-25 20:55:13.854880] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid196622 ]
00:07:58.513 EAL: No free 2048 kB hugepages reported on node 1
00:07:58.513 [2024-04-25 20:55:13.990293] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation.
00:07:58.513 [2024-04-25 20:55:14.026591] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:58.513 [2024-04-25 20:55:14.046016] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:58.513 [2024-04-25 20:55:14.098199] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:58.513 [2024-04-25 20:55:14.114508] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4410 ***
00:07:58.513 INFO: Running with entropic power schedule (0xFF, 100).
00:07:58.513 INFO: Seed: 1450067829
00:07:58.513 INFO: Loaded 1 modules (348579 inline 8-bit counters): 348579 [0x2764bcc, 0x27b9d6f),
00:07:58.513 INFO: Loaded 1 PC tables (348579 PCs): 348579 [0x27b9d70,0x2d0b7a0),
00:07:58.513 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10
00:07:58.513 INFO: A corpus is not provided, starting from an empty corpus
00:07:58.513 #2 INITED exec/s: 0 rss: 60Mb
00:07:58.513 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:07:58.513 This may also happen if the target rejected all inputs we tried so far
00:07:58.513 [2024-04-25 20:55:14.169986] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3b0a0707 cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:58.513 [2024-04-25 20:55:14.170020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:58.513 [2024-04-25 20:55:14.170080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:07070707 cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:58.513 [2024-04-25 20:55:14.170094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:58.513 [2024-04-25 20:55:14.170150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:07070707 cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:58.513 [2024-04-25 20:55:14.170163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:59.032 NEW_FUNC[1/670]: 0x4b0ad0 in fuzz_admin_security_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:205
00:07:59.032 NEW_FUNC[2/670]: 0x4e0360 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:07:59.032 #5 NEW cov: 11665 ft: 11666 corp: 2/30b lim: 40 exec/s: 0 rss: 67Mb L: 29/29 MS: 3 InsertByte-CrossOver-InsertRepeatedBytes-
00:07:59.032 [2024-04-25 20:55:14.480692] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3b0aba07 cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.032 [2024-04-25 20:55:14.480724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:59.032 [2024-04-25 20:55:14.480785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:07070707 cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.033 [2024-04-25 20:55:14.480799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:59.033 [2024-04-25 20:55:14.480859] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:07070707 cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.033 [2024-04-25 20:55:14.480873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:59.033 #6 NEW cov: 11795 ft: 12175 corp: 3/59b lim: 40 exec/s: 0 rss: 68Mb L: 29/29 MS: 1 ChangeByte-
00:07:59.033 [2024-04-25 20:55:14.530661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3b0aba07 cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.033 [2024-04-25 20:55:14.530686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:59.033 [2024-04-25 20:55:14.530742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:07070707 cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.033 [2024-04-25 20:55:14.530756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:59.033 #7 NEW cov: 11801 ft: 12751 corp: 4/81b lim: 40 exec/s: 0 rss: 68Mb L: 22/29 MS: 1 EraseBytes-
00:07:59.033 [2024-04-25 20:55:14.570859] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3b0a0707 cdw11:07270707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.033 [2024-04-25 20:55:14.570883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:59.033 [2024-04-25 20:55:14.570943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:07070707 cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.033 [2024-04-25 20:55:14.570957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:59.033 [2024-04-25 20:55:14.571011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:07070707 cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.033 [2024-04-25 20:55:14.571025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:59.033 #8 NEW cov: 11886 ft: 12993 corp: 5/110b lim: 40 exec/s: 0 rss: 68Mb L: 29/29 MS: 1 ChangeBit-
00:07:59.033 [2024-04-25 20:55:14.610944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:81818181 cdw11:81818181 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.033 [2024-04-25 20:55:14.610968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:59.033 [2024-04-25 20:55:14.611019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:81818181 cdw11:81818181 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.033 [2024-04-25 20:55:14.611032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:59.033 [2024-04-25 20:55:14.611084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:81818181 cdw11:81818181 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.033 [2024-04-25 20:55:14.611098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:59.033 #13 NEW cov: 11886 ft: 13099 corp: 6/135b lim: 40 exec/s: 0 rss: 68Mb L: 25/29 MS: 5 ChangeBinInt-ChangeByte-InsertByte-EraseBytes-InsertRepeatedBytes-
00:07:59.033 [2024-04-25 20:55:14.651049] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3b0a0707 cdw11:07270707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.033 [2024-04-25 20:55:14.651074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:59.033 [2024-04-25 20:55:14.651134] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:07070707 cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.033 [2024-04-25 20:55:14.651151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:59.033 [2024-04-25 20:55:14.651208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:07070707 cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.033 [2024-04-25 20:55:14.651222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:59.033 #14 NEW cov: 11886 ft: 13275 corp: 7/165b lim: 40 exec/s: 0 rss: 68Mb L: 30/30 MS: 1 InsertByte-
00:07:59.033 [2024-04-25 20:55:14.691150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3b0aba07 cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.033 [2024-04-25 20:55:14.691175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:59.033 [2024-04-25 20:55:14.691232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:07070707 cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.033 [2024-04-25 20:55:14.691245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:59.033 [2024-04-25 20:55:14.691304] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:07070707 cdw11:07270707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.033 [2024-04-25 20:55:14.691328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:59.293 #15 NEW cov: 11886 ft: 13330 corp: 8/194b lim: 40 exec/s: 0 rss: 68Mb L: 29/30 MS: 1 ChangeBit-
00:07:59.293 [2024-04-25 20:55:14.731153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3b0aba07 cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.293 [2024-04-25 20:55:14.731177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:59.293 [2024-04-25 20:55:14.731234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:07070707 cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.293 [2024-04-25 20:55:14.731247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:59.293 #16 NEW cov: 11886 ft: 13355 corp: 9/210b lim: 40 exec/s: 0 rss: 68Mb L: 16/30 MS: 1 EraseBytes-
00:07:59.293 [2024-04-25 20:55:14.771274] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3b0aba07 cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.293 [2024-04-25 20:55:14.771297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:59.293 [2024-04-25 20:55:14.771353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:07072707 cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.293 [2024-04-25 20:55:14.771366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:59.293 #22 NEW cov: 11886 ft: 13393 corp: 10/228b lim: 40 exec/s: 0 rss: 69Mb L: 18/30 MS: 1 EraseBytes-
00:07:59.293 [2024-04-25 20:55:14.811362] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3bba0707 cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.293 [2024-04-25 20:55:14.811386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:59.293 [2024-04-25 20:55:14.811445] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:07070707 cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.293 [2024-04-25 20:55:14.811458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:59.293 #23 NEW cov: 11886 ft: 13431 corp: 11/244b lim: 40 exec/s: 0 rss: 69Mb L: 16/30 MS: 1 CopyPart-
00:07:59.293 [2024-04-25 20:55:14.851471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3b0aba07 cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.293 [2024-04-25 20:55:14.851495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:59.293 [2024-04-25 20:55:14.851553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:07070707 cdw11:0aba0707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.293 [2024-04-25 20:55:14.851568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:59.293 #24 NEW cov: 11886 ft: 13445 corp: 12/260b lim: 40 exec/s: 0 rss: 69Mb L: 16/30 MS: 1 CopyPart-
00:07:59.293 [2024-04-25 20:55:14.891574] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3b070707 cdw11:07ba0707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.293 [2024-04-25 20:55:14.891598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:59.293 [2024-04-25 20:55:14.891656] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:07070707 cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.293 [2024-04-25 20:55:14.891669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:59.293 #25 NEW cov: 11886 ft: 13475 corp: 13/276b lim: 40 exec/s: 0 rss: 69Mb L: 16/30 MS: 1 ShuffleBytes-
00:07:59.293 [2024-04-25 20:55:14.931859] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:81818181 cdw11:81818181 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.293 [2024-04-25 20:55:14.931883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:59.293 [2024-04-25 20:55:14.931940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:81818181 cdw11:81818181 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.293 [2024-04-25 20:55:14.931953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:59.293 [2024-04-25 20:55:14.932013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:81818181 cdw11:81818181 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.293 [2024-04-25 20:55:14.932027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:59.554 #26 NEW cov: 11886 ft: 13479 corp: 14/301b lim: 40 exec/s: 0 rss: 69Mb L: 25/30 MS: 1 CopyPart-
00:07:59.554 [2024-04-25 20:55:14.972067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3bffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.554 [2024-04-25 20:55:14.972091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:59.554 [2024-04-25 20:55:14.972149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.554 [2024-04-25 20:55:14.972163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:59.554 [2024-04-25 20:55:14.972218] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffff0aba cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.554 [2024-04-25 20:55:14.972231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:59.554 [2024-04-25 20:55:14.972286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:07070707 cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.554 [2024-04-25 20:55:14.972302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:59.554 #27 NEW cov: 11886 ft: 13945 corp: 15/340b lim: 40 exec/s: 0 rss: 69Mb L: 39/39 MS: 1 InsertRepeatedBytes-
00:07:59.554 [2024-04-25 20:55:15.012020] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3b0aba07 cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.554 [2024-04-25 20:55:15.012044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:59.554 [2024-04-25 20:55:15.012101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:07070707 cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.554 [2024-04-25 20:55:15.012115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:59.554 [2024-04-25 20:55:15.012170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:07070707 cdw11:07070705 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.554 [2024-04-25 20:55:15.012183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:59.554 #28 NEW cov: 11886 ft: 13956 corp: 16/369b lim: 40 exec/s: 0 rss: 69Mb L: 29/39 MS: 1 ChangeBit-
00:07:59.554 [2024-04-25 20:55:15.051988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3b070707 cdw11:07ba0707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.554 [2024-04-25 20:55:15.052015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:59.554 [2024-04-25 20:55:15.052072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:07000000 cdw11:00000007 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.554 [2024-04-25 20:55:15.052085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:59.554 NEW_FUNC[1/1]: 0x19e6b30 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
00:07:59.554 #29 NEW cov: 11909 ft: 13992 corp: 17/391b lim: 40 exec/s: 0 rss: 69Mb L: 22/39 MS: 1 InsertRepeatedBytes-
00:07:59.554 [2024-04-25 20:55:15.092249] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3b0a072c cdw11:07072707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.554 [2024-04-25 20:55:15.092273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:59.554 [2024-04-25 20:55:15.092330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:07070707 cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.554 [2024-04-25 20:55:15.092344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:59.554 [2024-04-25 20:55:15.092400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:07070707 cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.554 [2024-04-25 20:55:15.092413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:59.554 #30 NEW cov: 11909 ft: 14042 corp: 18/422b lim: 40 exec/s: 0 rss: 69Mb L: 31/39 MS: 1 InsertByte-
00:07:59.554 [2024-04-25 20:55:15.132243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.554 [2024-04-25 20:55:15.132267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:59.554 [2024-04-25 20:55:15.132324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.554 [2024-04-25 20:55:15.132342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:59.554 #32 NEW cov: 11909 ft: 14056 corp: 19/443b lim: 40 exec/s: 32 rss: 69Mb L: 21/39 MS: 2 ShuffleBytes-InsertRepeatedBytes-
00:07:59.554 [2024-04-25 20:55:15.172469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:81818181 cdw11:81818181 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.554 [2024-04-25 20:55:15.172492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:59.554 [2024-04-25 20:55:15.172550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:81818181 cdw11:81818181 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.554 [2024-04-25 20:55:15.172564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:59.554 [2024-04-25 20:55:15.172623] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:81818181 cdw11:81818181 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.554 [2024-04-25 20:55:15.172637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:59.554 #33 NEW cov: 11909 ft: 14085 corp: 20/469b lim: 40 exec/s: 33 rss: 69Mb L: 26/39 MS: 1 InsertByte-
00:07:59.554 [2024-04-25 20:55:15.212467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3b070707 cdw11:07ba0707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.554 [2024-04-25 20:55:15.212492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:59.554 [2024-04-25 20:55:15.212550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:07000000 cdw11:00000007 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.554 [2024-04-25 20:55:15.212563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:59.814 #34 NEW cov: 11909 ft: 14095 corp: 21/491b lim: 40 exec/s: 34 rss: 70Mb L: 22/39 MS: 1 ShuffleBytes-
00:07:59.814 [2024-04-25 20:55:15.252452] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3b073bff cdw11:ffffff07 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.815 [2024-04-25 20:55:15.252476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:59.815 #35 NEW cov: 11909 ft: 14415 corp: 22/501b lim: 40 exec/s: 35 rss: 70Mb L: 10/39 MS: 1 CrossOver-
00:07:59.815 [2024-04-25 20:55:15.292911] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3bffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.815 [2024-04-25 20:55:15.292935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:59.815 [2024-04-25 20:55:15.292996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:fffdffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.815 [2024-04-25 20:55:15.293010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:59.815 [2024-04-25 20:55:15.293084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffff0aba cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.815 [2024-04-25 20:55:15.293098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:59.815 [2024-04-25 20:55:15.293155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:07070707 cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.815 [2024-04-25 20:55:15.293168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:59.815 #41 NEW cov: 11909 ft: 14429 corp: 23/540b lim: 40 exec/s: 41 rss: 70Mb L: 39/39 MS: 1 ChangeBit-
00:07:59.815 [2024-04-25 20:55:15.332688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3b073bff cdw11:fffbff07 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.815 [2024-04-25 20:55:15.332713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:59.815 #47 NEW cov: 11909 ft: 14444 corp: 24/550b lim: 40 exec/s: 47 rss: 70Mb L: 10/39 MS: 1 ChangeBit-
00:07:59.815 [2024-04-25 20:55:15.373047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3b0a0707 cdw11:07270707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.815 [2024-04-25 20:55:15.373072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:59.815 [2024-04-25 20:55:15.373129] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:07070707 cdw11:07075507 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.815 [2024-04-25 20:55:15.373143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:59.815 [2024-04-25 20:55:15.373201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:07070707 cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.815 [2024-04-25 20:55:15.373214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:59.815 #48 NEW cov: 11909 ft: 14467 corp: 25/581b lim: 40 exec/s: 48 rss: 70Mb L: 31/39 MS: 1 InsertByte-
00:07:59.815 [2024-04-25 20:55:15.413030] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3bba0707 cdw11:07071000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.815 [2024-04-25 20:55:15.413055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:59.815 [2024-04-25 20:55:15.413114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:07070707 cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.815 [2024-04-25 20:55:15.413127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:59.815 #49 NEW cov: 11909 ft: 14505 corp: 26/597b lim: 40 exec/s: 49 rss: 70Mb L: 16/39 MS: 1 ChangeBinInt-
00:07:59.815 [2024-04-25 20:55:15.453188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3b070707 cdw11:07ba0707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.815 [2024-04-25 20:55:15.453212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:59.815 [2024-04-25 20:55:15.453271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:07070707 cdw11:07ba0707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:59.815 [2024-04-25 20:55:15.453284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:59.815 #50 NEW cov: 11909 ft: 14548 corp: 27/620b lim: 40 exec/s: 50 rss: 70Mb L: 23/39 MS: 1 CopyPart-
00:08:00.075 [2024-04-25 20:55:15.493428] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3b0aba07 cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.075 [2024-04-25 20:55:15.493452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:08:00.075 [2024-04-25 20:55:15.493510] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:07070707 cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.075 [2024-04-25 20:55:15.493523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:08:00.075 [2024-04-25 20:55:15.493579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:07070707 cdw11:07270707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.075 [2024-04-25 20:55:15.493595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:08:00.075 #51 NEW cov: 11909 ft: 14557 corp: 28/649b lim: 40 exec/s: 51 rss: 70Mb L: 29/39 MS: 1 ChangeByte-
00:08:00.075 [2024-04-25 20:55:15.533619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3b0aba07 cdw11:07000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.075 [2024-04-25 20:55:15.533643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:08:00.075 [2024-04-25 20:55:15.533701] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.075 [2024-04-25 20:55:15.533715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:08:00.075 [2024-04-25 20:55:15.533772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000007 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.075 [2024-04-25 20:55:15.533785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:08:00.075 [2024-04-25 20:55:15.533842] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:07070707 cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.075 [2024-04-25 20:55:15.533855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:08:00.075 #52 NEW cov: 11909 ft: 14585 corp: 29/683b lim: 40 exec/s: 52 rss: 70Mb L: 34/39 MS: 1 InsertRepeatedBytes-
00:08:00.075 [2024-04-25 20:55:15.573660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3b0aba07 cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.075 [2024-04-25 20:55:15.573684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:08:00.075 [2024-04-25 20:55:15.573738] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:07070707 cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.075 [2024-04-25 20:55:15.573752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:08:00.075 [2024-04-25 20:55:15.573807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:07070707 cdw11:07000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.075 [2024-04-25 20:55:15.573820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:08:00.075 #53 NEW cov: 11909 ft: 14603 corp: 30/713b lim: 40 exec/s: 53 rss: 70Mb L: 30/39 MS: 1 InsertRepeatedBytes-
00:08:00.075 [2024-04-25 20:55:15.613761] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.075 [2024-04-25 20:55:15.613785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:08:00.075 [2024-04-25 20:55:15.613845] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.075 [2024-04-25 20:55:15.613859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:08:00.075 [2024-04-25 20:55:15.613918] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.075 [2024-04-25 20:55:15.613931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:08:00.075 #55 NEW cov: 11909 ft: 14615 corp: 31/738b lim: 40 exec/s: 55 rss: 70Mb L: 25/39 MS: 2 CopyPart-InsertRepeatedBytes-
00:08:00.075 [2024-04-25 20:55:15.643735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3b070707 cdw11:07ba0707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.075 [2024-04-25 20:55:15.643759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:08:00.075 [2024-04-25 20:55:15.643816] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:07000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.075 [2024-04-25 20:55:15.643830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:08:00.075 #56 NEW cov: 11909 ft: 14627 corp: 32/760b lim: 40 exec/s: 56 rss: 70Mb L: 22/39 MS: 1 CMP- DE: "\000\000\000\000"-
00:08:00.075 [2024-04-25 20:55:15.684098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:03000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.075 [2024-04-25 20:55:15.684122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:08:00.075 [2024-04-25 20:55:15.684182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:0affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.075 [2024-04-25 20:55:15.684195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:08:00.075 [2024-04-25 20:55:15.684252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.076 [2024-04-25 20:55:15.684265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:08:00.076 [2024-04-25 20:55:15.684324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.076 [2024-04-25 20:55:15.684337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:08:00.076 #57 NEW cov: 11909 ft: 14640 corp: 33/793b lim: 40 exec/s: 57 rss: 70Mb L: 33/39 MS: 1 CMP- DE: "\003\000\000\000\000\000\000\000"-
00:08:00.076 [2024-04-25 20:55:15.723885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3b073bf6 cdw11:ffffff07 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.076 [2024-04-25 20:55:15.723909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:08:00.335 #58 NEW cov: 11909 ft: 14660 corp: 34/803b lim: 40 exec/s: 58 rss: 70Mb L: 10/39 MS: 1 ChangeByte-
00:08:00.335 [2024-04-25 20:55:15.764248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3b0aba07 cdw11:07f9f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.335 [2024-04-25 20:55:15.764272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:08:00.335 [2024-04-25 20:55:15.764327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:fc070707 cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.335 [2024-04-25 20:55:15.764340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:08:00.335 [2024-04-25 20:55:15.764392] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:07070707 cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.335 [2024-04-25 20:55:15.764405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:08:00.335 #59 NEW cov: 11909 ft: 14693 corp: 35/832b lim: 40 exec/s: 59 rss: 70Mb L: 29/39 MS: 1 ChangeBinInt-
00:08:00.335 [2024-04-25 20:55:15.804242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3b0a0781 cdw11:81818181 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.335 [2024-04-25 20:55:15.804270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:08:00.335 [2024-04-25 20:55:15.804324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:07070707 cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.335 [2024-04-25 20:55:15.804338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:08:00.335 #60 NEW cov: 11909 ft: 14705 corp: 36/850b lim: 40 exec/s: 60 rss: 70Mb L: 18/39 MS: 1 CrossOver-
00:08:00.335 [2024-04-25 20:55:15.844537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:03000000 cdw11:0000000b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.335 [2024-04-25 20:55:15.844561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:08:00.335 [2024-04-25 20:55:15.844616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:0affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.335 [2024-04-25 20:55:15.844630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:08:00.335 [2024-04-25 20:55:15.844684] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.335 [2024-04-25 20:55:15.844697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:08:00.335 [2024-04-25 20:55:15.844750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.335 [2024-04-25 20:55:15.844763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:08:00.335 #61 NEW cov: 11909 ft: 14723 corp: 37/883b lim: 40 exec/s: 61 rss: 70Mb L: 33/39 MS: 1 ChangeByte-
00:08:00.335 [2024-04-25 20:55:15.884585] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:16161616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.335 [2024-04-25 20:55:15.884610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:08:00.335 [2024-04-25 20:55:15.884666] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:16161616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.335 [2024-04-25 20:55:15.884679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:08:00.335 [2024-04-25 20:55:15.884736] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:16161616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.335 [2024-04-25 20:55:15.884749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:08:00.335 #63 NEW cov: 11909 ft: 14726 corp: 38/910b lim: 40 exec/s: 63 rss: 70Mb L: 27/39 MS: 2 ChangeBinInt-InsertRepeatedBytes-
00:08:00.335 [2024-04-25 20:55:15.914516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3b0aba07 cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.335 [2024-04-25 20:55:15.914539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:08:00.335 [2024-04-25 20:55:15.914593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:07072707 cdw11:05070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.335 [2024-04-25 20:55:15.914607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:08:00.335 #64 NEW cov: 11909 ft: 14774 corp: 39/928b lim: 40 exec/s: 64 rss: 70Mb L: 18/39 MS: 1 ChangeBit-
00:08:00.335 [2024-04-25 20:55:15.954914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:81818181 cdw11:81818181 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.335 [2024-04-25 20:55:15.954937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:08:00.335 [2024-04-25 20:55:15.954996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:81818181 cdw11:81818181 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.335 [2024-04-25 20:55:15.955009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:08:00.335 [2024-04-25 20:55:15.955064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:0a078181 cdw11:81818107 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.335 [2024-04-25 20:55:15.955077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:08:00.335 [2024-04-25 20:55:15.955132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:81818181 cdw11:81818181 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.335 [2024-04-25 20:55:15.955145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:08:00.335 #65 NEW cov: 11909 ft: 14809 corp: 40/961b lim: 40 exec/s: 65 rss: 70Mb L: 33/39 MS: 1 CrossOver-
00:08:00.335 [2024-04-25 20:55:15.995089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3b0aba07 cdw11:07000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.335 [2024-04-25 20:55:15.995113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:08:00.335 [2024-04-25 20:55:15.995166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.335 [2024-04-25 20:55:15.995179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:08:00.335 [2024-04-25 20:55:15.995232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.335 [2024-04-25 20:55:15.995245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:08:00.335 [2024-04-25 20:55:15.995299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.335 [2024-04-25 20:55:15.995312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:08:00.596 #66 NEW cov: 11909 ft: 14814 corp: 41/1000b lim: 40 exec/s: 66 rss: 70Mb L: 39/39 MS: 1 InsertRepeatedBytes-
00:08:00.596 [2024-04-25 20:55:16.035019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3b0aba07 cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.596 [2024-04-25 20:55:16.035043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:08:00.596 [2024-04-25 20:55:16.035100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:3b0aba07 cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.596 [2024-04-25 20:55:16.035113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:08:00.596 [2024-04-25 20:55:16.035168] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:07070707 cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.596 [2024-04-25 20:55:16.035181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:08:00.596 #67 NEW cov: 11909 ft: 14821 corp: 42/1025b lim: 40 exec/s: 67 rss: 70Mb L: 25/39 MS: 1 CrossOver-
00:08:00.596 [2024-04-25 20:55:16.075243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3b0aba07 cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.596 [2024-04-25 20:55:16.075267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:08:00.596 [2024-04-25 20:55:16.075321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:07070707 cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.596 [2024-04-25 20:55:16.075335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:08:00.596 [2024-04-25 20:55:16.075388] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:07078c8c cdw11:8c8c8c8c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.596 [2024-04-25 20:55:16.075402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:08:00.596 [2024-04-25 20:55:16.075452] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:8c8c8c8c cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.596 [2024-04-25 20:55:16.075465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:08:00.596 #68 NEW cov: 11909 ft: 14832 corp: 43/1057b lim: 40 exec/s: 68 rss: 70Mb L: 32/39 MS: 1 InsertRepeatedBytes-
00:08:00.596 [2024-04-25 20:55:16.115138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3b0aba07 cdw11:07070707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.596 [2024-04-25 20:55:16.115162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:08:00.596 [2024-04-25 20:55:16.115216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:07000000 cdw11:00072707 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.596 [2024-04-25 20:55:16.115229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:08:00.596 #69 NEW cov: 11909 ft: 14844 corp: 44/1079b lim: 40 exec/s: 69 rss: 70Mb L: 22/39 MS: 1 PersAutoDict- DE: "\000\000\000\000"-
00:08:00.596 [2024-04-25 20:55:16.155381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:81818181 cdw11:81818181 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.596 [2024-04-25 20:55:16.155404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:08:00.596 [2024-04-25 20:55:16.155461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:81818181 cdw11:81818181 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.596 [2024-04-25 20:55:16.155474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:08:00.596 [2024-04-25 20:55:16.155529] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:0a078181 cdw11:81818107 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.596 [2024-04-25 20:55:16.155542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:08:00.596 [2024-04-25 20:55:16.155596] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:03000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:00.596 [2024-04-25 20:55:16.155609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:08:00.596 #70 NEW cov: 11909 ft: 14846 corp: 45/1112b lim: 40 exec/s: 35 rss: 70Mb L: 33/39 MS: 1 PersAutoDict- DE: "\003\000\000\000\000\000\000\000"-
00:08:00.596 #70 DONE cov: 11909 ft: 14846 corp: 45/1112b lim: 40 exec/s: 35 rss: 70Mb
00:08:00.596 ###### Recommended dictionary. ######
00:08:00.596 "\000\000\000\000" # Uses: 1
00:08:00.596 "\003\000\000\000\000\000\000\000" # Uses: 1
00:08:00.596 ###### End of recommended dictionary. ######
00:08:00.596 Done 70 runs in 2 second(s)
00:08:00.856 20:55:16 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_10.conf /var/tmp/suppress_nvmf_fuzz
00:08:00.856 20:55:16 -- ../common.sh@72 -- # (( i++ ))
00:08:00.856 20:55:16 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:00.856 20:55:16 -- ../common.sh@73 -- # start_llvm_fuzz 11 1 0x1
00:08:00.856 20:55:16 -- nvmf/run.sh@23 -- # local fuzzer_type=11
00:08:00.856 20:55:16 -- nvmf/run.sh@24 -- # local timen=1
00:08:00.856 20:55:16 -- nvmf/run.sh@25 -- # local core=0x1
00:08:00.856 20:55:16 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11
00:08:00.856 20:55:16 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_11.conf
00:08:00.856 20:55:16 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz
00:08:00.856 20:55:16 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0
00:08:00.856 20:55:16 -- nvmf/run.sh@34 -- # printf %02d 11
00:08:00.856 20:55:16 -- nvmf/run.sh@34 -- # port=4411
00:08:00.856 20:55:16 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11
00:08:00.856 20:55:16 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411'
00:08:00.856 20:55:16 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4411"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:00.856 20:55:16 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect
00:08:00.856 20:55:16 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create
00:08:00.856 20:55:16 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' -c /tmp/fuzz_json_11.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 -Z 11
00:08:00.856 [2024-04-25 20:55:16.335056] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization...
00:08:00.856 [2024-04-25 20:55:16.335127] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid197151 ]
00:08:00.856 EAL: No free 2048 kB hugepages reported on node 1
00:08:01.115 [2024-04-25 20:55:16.479034] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation.
00:08:01.115 [2024-04-25 20:55:16.517426] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:01.115 [2024-04-25 20:55:16.537062] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:01.115 [2024-04-25 20:55:16.589323] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:01.115 [2024-04-25 20:55:16.605638] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4411 ***
00:08:01.115 INFO: Running with entropic power schedule (0xFF, 100).
00:08:01.115 INFO: Seed: 3941081314 00:08:01.115 INFO: Loaded 1 modules (348579 inline 8-bit counters): 348579 [0x2764bcc, 0x27b9d6f), 00:08:01.115 INFO: Loaded 1 PC tables (348579 PCs): 348579 [0x27b9d70,0x2d0b7a0), 00:08:01.115 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:01.115 INFO: A corpus is not provided, starting from an empty corpus 00:08:01.115 #2 INITED exec/s: 0 rss: 60Mb 00:08:01.115 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:01.115 This may also happen if the target rejected all inputs we tried so far 00:08:01.115 [2024-04-25 20:55:16.681937] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:24000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.115 [2024-04-25 20:55:16.681972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.115 [2024-04-25 20:55:16.682121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.115 [2024-04-25 20:55:16.682143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.374 NEW_FUNC[1/671]: 0x4b2840 in fuzz_admin_security_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:223 00:08:01.374 NEW_FUNC[2/671]: 0x4e0360 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:01.374 #6 NEW cov: 11677 ft: 11676 corp: 2/24b lim: 40 exec/s: 0 rss: 68Mb L: 23/23 MS: 4 CrossOver-CopyPart-ChangeByte-InsertRepeatedBytes- 00:08:01.374 [2024-04-25 20:55:17.013406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a525252 cdw11:52525252 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.374 [2024-04-25 20:55:17.013446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.374 [2024-04-25 20:55:17.013578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:52525252 cdw11:52525252 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.374 [2024-04-25 20:55:17.013596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.374 [2024-04-25 20:55:17.013740] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:52525252 cdw11:52525252 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.374 [2024-04-25 20:55:17.013758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.374 [2024-04-25 20:55:17.013888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:52525252 cdw11:52525252 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.374 [2024-04-25 20:55:17.013906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.374 #7 NEW cov: 11807 ft: 12456 corp: 3/62b lim: 40 exec/s: 0 rss: 68Mb L: 38/38 MS: 1 InsertRepeatedBytes- 00:08:01.633 [2024-04-25 20:55:17.063464] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:4affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.633 [2024-04-25 20:55:17.063493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.633 [2024-04-25 20:55:17.063631] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.633 [2024-04-25 20:55:17.063650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.633 [2024-04-25 20:55:17.063783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.633 [2024-04-25 20:55:17.063802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.633 [2024-04-25 20:55:17.063939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.633 [2024-04-25 20:55:17.063956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.633 #12 NEW cov: 11813 ft: 12822 corp: 4/98b lim: 40 exec/s: 0 rss: 68Mb L: 36/38 MS: 5 ChangeBit-ShuffleBytes-ShuffleBytes-CopyPart-InsertRepeatedBytes- 00:08:01.633 [2024-04-25 20:55:17.113615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:4affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.633 [2024-04-25 20:55:17.113643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.633 [2024-04-25 20:55:17.113776] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.633 [2024-04-25 20:55:17.113794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.633 [2024-04-25 20:55:17.113930] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.633 [2024-04-25 20:55:17.113947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.633 [2024-04-25 20:55:17.114077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:9f9f9fff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.633 [2024-04-25 20:55:17.114097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.633 #13 NEW cov: 11898 ft: 13042 corp: 5/137b lim: 40 exec/s: 0 rss: 68Mb L: 39/39 MS: 1 InsertRepeatedBytes- 00:08:01.633 [2024-04-25 20:55:17.174165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:4affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.633 [2024-04-25 20:55:17.174192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.633 [2024-04-25 
20:55:17.174315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.633 [2024-04-25 20:55:17.174333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.633 [2024-04-25 20:55:17.174469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.633 [2024-04-25 20:55:17.174486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.633 [2024-04-25 20:55:17.174613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:9f9f9fff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.633 [2024-04-25 20:55:17.174631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.633 [2024-04-25 20:55:17.174760] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffff004a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.633 [2024-04-25 20:55:17.174776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:01.633 #14 NEW cov: 11898 ft: 13234 corp: 6/177b lim: 40 exec/s: 0 rss: 68Mb L: 40/40 MS: 1 CrossOver- 00:08:01.633 [2024-04-25 20:55:17.233966] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:4affffff cdw11:ff03ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.633 [2024-04-25 20:55:17.233991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.633 [2024-04-25 20:55:17.234131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.633 [2024-04-25 20:55:17.234150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.633 [2024-04-25 20:55:17.234283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.633 [2024-04-25 20:55:17.234301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.633 [2024-04-25 20:55:17.234435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:9f9f9fff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.633 [2024-04-25 20:55:17.234455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.633 #15 NEW cov: 11898 ft: 13380 corp: 7/216b lim: 40 exec/s: 0 rss: 68Mb L: 39/40 MS: 1 ChangeBinInt- 00:08:01.633 [2024-04-25 20:55:17.283356] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:4affffff cdw11:ffff4aff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.633 [2024-04-25 20:55:17.283386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.893 #16 
NEW cov: 11898 ft: 14166 corp: 8/225b lim: 40 exec/s: 0 rss: 69Mb L: 9/40 MS: 1 CrossOver- 00:08:01.893 [2024-04-25 20:55:17.344397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:4affffff cdw11:ff03ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.893 [2024-04-25 20:55:17.344423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.893 [2024-04-25 20:55:17.344561] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.893 [2024-04-25 20:55:17.344581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.893 [2024-04-25 20:55:17.344705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.893 [2024-04-25 20:55:17.344724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.893 [2024-04-25 20:55:17.344847] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:3b9f9fff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.893 [2024-04-25 20:55:17.344864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.893 #17 NEW cov: 11898 ft: 14214 corp: 9/264b lim: 40 exec/s: 0 rss: 69Mb L: 39/40 MS: 1 ChangeByte- 00:08:01.893 [2024-04-25 20:55:17.404794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:4affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.893 [2024-04-25 20:55:17.404820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.893 [2024-04-25 20:55:17.404966] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.893 [2024-04-25 20:55:17.404985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.893 [2024-04-25 20:55:17.405131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.893 [2024-04-25 20:55:17.405150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.894 [2024-04-25 20:55:17.405290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.894 [2024-04-25 20:55:17.405309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.894 [2024-04-25 20:55:17.405438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffff004a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.894 [2024-04-25 20:55:17.405458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:01.894 
#18 NEW cov: 11898 ft: 14245 corp: 10/304b lim: 40 exec/s: 0 rss: 69Mb L: 40/40 MS: 1 CrossOver- 00:08:01.894 [2024-04-25 20:55:17.454713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a525252 cdw11:524affff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.894 [2024-04-25 20:55:17.454740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.894 [2024-04-25 20:55:17.454873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:52525252 cdw11:ff525252 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.894 [2024-04-25 20:55:17.454892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.894 [2024-04-25 20:55:17.455025] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:5252ffff cdw11:4a525252 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.894 [2024-04-25 20:55:17.455045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.894 [2024-04-25 20:55:17.455179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:525252ff cdw11:52ff5252 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.894 [2024-04-25 20:55:17.455199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.894 #19 NEW cov: 11898 ft: 14274 corp: 11/337b lim: 40 exec/s: 0 rss: 69Mb L: 33/40 MS: 1 CrossOver- 00:08:01.894 [2024-04-25 20:55:17.514966] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a525252 cdw11:524affff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.894 [2024-04-25 20:55:17.514997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.894 [2024-04-25 20:55:17.515142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:52525252 cdw11:ff525252 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.894 [2024-04-25 20:55:17.515162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.894 [2024-04-25 20:55:17.515292] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:5252ffff cdw11:4a525252 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.894 [2024-04-25 20:55:17.515311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.894 [2024-04-25 20:55:17.515438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:525252ff cdw11:52ff5252 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.894 [2024-04-25 20:55:17.515457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.894 NEW_FUNC[1/1]: 0x19e6b30 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:01.894 #20 NEW cov: 11921 ft: 14293 corp: 12/370b lim: 40 exec/s: 0 rss: 69Mb L: 33/40 MS: 1 ShuffleBytes- 00:08:02.155 [2024-04-25 20:55:17.575143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 
nsid:0 cdw10:4affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.155 [2024-04-25 20:55:17.575174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.155 [2024-04-25 20:55:17.575303] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.155 [2024-04-25 20:55:17.575322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.155 [2024-04-25 20:55:17.575460] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.155 [2024-04-25 20:55:17.575479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.155 [2024-04-25 20:55:17.575621] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.155 [2024-04-25 20:55:17.575639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.155 #21 NEW cov: 11921 ft: 14335 corp: 13/408b lim: 40 exec/s: 0 rss: 69Mb L: 38/40 MS: 1 CopyPart- 00:08:02.156 [2024-04-25 20:55:17.624730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:4affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.156 [2024-04-25 20:55:17.624759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.156 [2024-04-25 20:55:17.624899] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.156 [2024-04-25 20:55:17.624919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.156 #22 NEW cov: 11921 ft: 14354 corp: 14/428b lim: 40 exec/s: 0 rss: 69Mb L: 20/40 MS: 1 EraseBytes- 00:08:02.156 [2024-04-25 20:55:17.674583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:4affffff cdw11:ffff4aff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.156 [2024-04-25 20:55:17.674612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.156 #23 NEW cov: 11921 ft: 14376 corp: 15/437b lim: 40 exec/s: 23 rss: 69Mb L: 9/40 MS: 1 ChangeBit- 00:08:02.156 [2024-04-25 20:55:17.735598] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:4affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.156 [2024-04-25 20:55:17.735625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.156 [2024-04-25 20:55:17.735757] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:97ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.156 [2024-04-25 20:55:17.735776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.156 
[2024-04-25 20:55:17.735908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.156 [2024-04-25 20:55:17.735925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.156 [2024-04-25 20:55:17.736066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.156 [2024-04-25 20:55:17.736083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.156 #24 NEW cov: 11921 ft: 14409 corp: 16/476b lim: 40 exec/s: 24 rss: 69Mb L: 39/40 MS: 1 InsertByte- 00:08:02.156 [2024-04-25 20:55:17.795815] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:24000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.156 [2024-04-25 20:55:17.795844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.156 [2024-04-25 20:55:17.795971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.156 [2024-04-25 20:55:17.795989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.156 [2024-04-25 20:55:17.796133] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0a000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.156 [2024-04-25 20:55:17.796153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.156 [2024-04-25 20:55:17.796286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.156 [2024-04-25 20:55:17.796306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.525 #25 NEW cov: 11921 ft: 14522 corp: 17/512b lim: 40 exec/s: 25 rss: 69Mb L: 36/40 MS: 1 CrossOver- 00:08:02.526 [2024-04-25 20:55:17.856013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a525252 cdw11:524affff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.526 [2024-04-25 20:55:17.856039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.526 [2024-04-25 20:55:17.856173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:52525252 cdw11:ff525252 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.526 [2024-04-25 20:55:17.856193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.526 [2024-04-25 20:55:17.856336] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:4052ffff cdw11:4a525252 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.526 [2024-04-25 20:55:17.856354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 
00:08:02.526 [2024-04-25 20:55:17.856491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:525252ff cdw11:52ff5252 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.526 [2024-04-25 20:55:17.856510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.526 #26 NEW cov: 11921 ft: 14537 corp: 18/545b lim: 40 exec/s: 26 rss: 69Mb L: 33/40 MS: 1 ChangeByte- 00:08:02.526 [2024-04-25 20:55:17.905829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:52525252 cdw11:52525252 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.526 [2024-04-25 20:55:17.905855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.526 [2024-04-25 20:55:17.905977] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:52525252 cdw11:52525252 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.526 [2024-04-25 20:55:17.905998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.526 [2024-04-25 20:55:17.906133] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:52525252 cdw11:52060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.526 [2024-04-25 20:55:17.906151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.526 #29 NEW cov: 11921 ft: 14741 corp: 19/571b lim: 40 exec/s: 29 rss: 69Mb L: 26/40 MS: 3 CrossOver-CMP-CrossOver- DE: "\006\000\000\000"- 00:08:02.526 [2024-04-25 20:55:17.956313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a525252 cdw11:524affff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.526 [2024-04-25 20:55:17.956341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.526 [2024-04-25 20:55:17.956468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:52525212 cdw11:ff525252 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.526 [2024-04-25 20:55:17.956486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.526 [2024-04-25 20:55:17.956617] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:4052ffff cdw11:4a525252 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.526 [2024-04-25 20:55:17.956638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.526 [2024-04-25 20:55:17.956764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:525252ff cdw11:52ff5252 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.526 [2024-04-25 20:55:17.956784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.526 #30 NEW cov: 11921 ft: 14756 corp: 20/604b lim: 40 exec/s: 30 rss: 69Mb L: 33/40 MS: 1 ChangeBit- 00:08:02.526 [2024-04-25 20:55:18.005999] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:4affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:02.526 [2024-04-25 20:55:18.006023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.526 [2024-04-25 20:55:18.006162] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff04ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.526 [2024-04-25 20:55:18.006181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.526 [2024-04-25 20:55:18.006309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.526 [2024-04-25 20:55:18.006327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.526 [2024-04-25 20:55:18.006461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:9f9f9fff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.526 [2024-04-25 20:55:18.006480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.526 #31 NEW cov: 11921 ft: 14864 corp: 21/643b lim: 40 exec/s: 31 rss: 69Mb L: 39/40 MS: 1 ChangeBinInt- 00:08:02.526 [2024-04-25 20:55:18.055779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:4affffff cdw11:ffff4aff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.526 [2024-04-25 20:55:18.055807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.526 #32 NEW cov: 11921 ft: 14929 corp: 22/653b lim: 40 exec/s: 32 rss: 69Mb L: 10/40 MS: 1 InsertByte- 00:08:02.526 [2024-04-25 20:55:18.095662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:4affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.526 [2024-04-25 20:55:18.095688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.526 [2024-04-25 20:55:18.095813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.526 [2024-04-25 20:55:18.095833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.526 #33 NEW cov: 11921 ft: 14945 corp: 23/673b lim: 40 exec/s: 33 rss: 69Mb L: 20/40 MS: 1 ShuffleBytes- 00:08:02.526 [2024-04-25 20:55:18.146889] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:4affffff cdw11:ff03ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.526 [2024-04-25 20:55:18.146915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.526 [2024-04-25 20:55:18.147062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.526 [2024-04-25 20:55:18.147079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.526 [2024-04-25 20:55:18.147219] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.526 [2024-04-25 20:55:18.147241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.526 [2024-04-25 20:55:18.147378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:3b9f9fff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.526 [2024-04-25 20:55:18.147396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.786 #34 NEW cov: 11921 ft: 15031 corp: 24/712b lim: 40 exec/s: 34 rss: 70Mb L: 39/40 MS: 1 CopyPart- 00:08:02.786 [2024-04-25 20:55:18.207011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:4affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.786 [2024-04-25 20:55:18.207037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.786 [2024-04-25 20:55:18.207165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.786 [2024-04-25 20:55:18.207184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.786 [2024-04-25 20:55:18.207320] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.786 [2024-04-25 20:55:18.207338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.786 [2024-04-25 20:55:18.207471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.786 [2024-04-25 20:55:18.207488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.786 #35 NEW cov: 11921 ft: 15044 corp: 25/750b lim: 40 exec/s: 35 rss: 70Mb L: 38/40 MS: 1 ShuffleBytes- 00:08:02.786 [2024-04-25 20:55:18.247177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:4afffdff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.786 [2024-04-25 20:55:18.247202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.786 [2024-04-25 20:55:18.247322] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.786 [2024-04-25 20:55:18.247340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.786 [2024-04-25 20:55:18.247471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.786 [2024-04-25 20:55:18.247488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.786 [2024-04-25 20:55:18.247620] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.786 [2024-04-25 20:55:18.247638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.786 #36 NEW cov: 11921 ft: 15066 corp: 26/788b lim: 40 exec/s: 36 rss: 70Mb L: 38/40 MS: 1 ChangeBit- 00:08:02.786 [2024-04-25 20:55:18.286908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:4a0a5252 cdw11:52524aff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.786 [2024-04-25 20:55:18.286934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.786 [2024-04-25 20:55:18.287073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ff525252 cdw11:52ff5252 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.786 [2024-04-25 20:55:18.287093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.786 [2024-04-25 20:55:18.287225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:525252ff cdw11:ff4a5252 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.786 [2024-04-25 20:55:18.287244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.786 [2024-04-25 20:55:18.287373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:52525252 cdw11:ff52ff52 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.786 [2024-04-25 20:55:18.287391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.786 #37 NEW cov: 11921 ft: 15081 corp: 27/822b lim: 40 exec/s: 37 rss: 70Mb L: 34/40 MS: 1 InsertByte- 00:08:02.786 [2024-04-25 20:55:18.337449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a525252 cdw11:524affff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.786 [2024-04-25 20:55:18.337475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.786 [2024-04-25 20:55:18.337606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:52525252 cdw11:ff525252 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.786 [2024-04-25 20:55:18.337626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.786 [2024-04-25 20:55:18.337757] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:5252ffff cdw11:4a521252 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.786 [2024-04-25 20:55:18.337775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.786 [2024-04-25 20:55:18.337904] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:525252ff cdw11:52ff5252 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.786 [2024-04-25 20:55:18.337921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.786 #38 NEW cov: 11921 
ft: 15092 corp: 28/855b lim: 40 exec/s: 38 rss: 70Mb L: 33/40 MS: 1 ChangeBit- 00:08:02.786 [2024-04-25 20:55:18.387616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a525252 cdw11:524affff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.786 [2024-04-25 20:55:18.387643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.786 [2024-04-25 20:55:18.387788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:52525252 cdw11:ff525252 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.786 [2024-04-25 20:55:18.387807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.786 [2024-04-25 20:55:18.387937] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:4052ffff cdw11:4a525252 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.786 [2024-04-25 20:55:18.387952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.786 [2024-04-25 20:55:18.388094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:52522400 cdw11:00000052 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.786 [2024-04-25 20:55:18.388113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.786 #39 NEW cov: 11921 ft: 15099 corp: 29/893b lim: 40 exec/s: 39 rss: 70Mb L: 38/40 MS: 1 CrossOver- 00:08:02.786 [2024-04-25 20:55:18.437729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:4affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.786 [2024-04-25 20:55:18.437758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.786 [2024-04-25 20:55:18.437894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.786 [2024-04-25 20:55:18.437912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.786 [2024-04-25 20:55:18.438057] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.786 [2024-04-25 20:55:18.438074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.786 [2024-04-25 20:55:18.438199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.786 [2024-04-25 20:55:18.438218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.046 #40 NEW cov: 11921 ft: 15119 corp: 30/929b lim: 40 exec/s: 40 rss: 70Mb L: 36/40 MS: 1 ShuffleBytes- 00:08:03.046 [2024-04-25 20:55:18.477322] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a525252 cdw11:524affff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.046 [2024-04-25 20:55:18.477347] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.046 [2024-04-25 20:55:18.477477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:52525252 cdw11:ff525252 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.046 [2024-04-25 20:55:18.477494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.046 #41 NEW cov: 11921 ft: 15143 corp: 31/951b lim: 40 exec/s: 41 rss: 70Mb L: 22/40 MS: 1 EraseBytes- 00:08:03.046 [2024-04-25 20:55:18.528077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a525252 cdw11:524affff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.046 [2024-04-25 20:55:18.528104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.046 [2024-04-25 20:55:18.528225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:52525252 cdw11:ff525252 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.046 [2024-04-25 20:55:18.528242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.046 [2024-04-25 20:55:18.528367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:4052ffff cdw11:4a525252 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.046 [2024-04-25 20:55:18.528384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.046 [2024-04-25 20:55:18.528510] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:52520000 cdw11:0052ff52 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.046 [2024-04-25 20:55:18.528527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.046 #42 NEW cov: 11921 ft: 15179 corp: 32/987b lim: 40 exec/s: 42 rss: 70Mb L: 36/40 MS: 1 InsertRepeatedBytes- 00:08:03.046 [2024-04-25 20:55:18.567782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:52525252 cdw11:52525252 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.046 [2024-04-25 20:55:18.567808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.046 [2024-04-25 20:55:18.567944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:52525252 cdw11:52525252 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.046 [2024-04-25 20:55:18.567962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.046 [2024-04-25 20:55:18.568102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:52525252 cdw11:52525252 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.046 [2024-04-25 20:55:18.568120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.046 [2024-04-25 20:55:18.568245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:52525206 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.046 [2024-04-25 
20:55:18.568263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.046 #43 NEW cov: 11921 ft: 15190 corp: 33/1019b lim: 40 exec/s: 43 rss: 70Mb L: 32/40 MS: 1 CopyPart- 00:08:03.046 [2024-04-25 20:55:18.627511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:4aff0afd cdw11:ffff52ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.046 [2024-04-25 20:55:18.627537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.046 #44 NEW cov: 11921 ft: 15216 corp: 34/1033b lim: 40 exec/s: 22 rss: 70Mb L: 14/40 MS: 1 CrossOver- 00:08:03.046 #44 DONE cov: 11921 ft: 15216 corp: 34/1033b lim: 40 exec/s: 22 rss: 70Mb 00:08:03.046 ###### Recommended dictionary. ###### 00:08:03.046 "\006\000\000\000" # Uses: 0 00:08:03.046 ###### End of recommended dictionary. ###### 00:08:03.046 Done 44 runs in 2 second(s) 00:08:03.306 20:55:18 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_11.conf /var/tmp/suppress_nvmf_fuzz 00:08:03.306 20:55:18 -- ../common.sh@72 -- # (( i++ )) 00:08:03.306 20:55:18 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:03.306 20:55:18 -- ../common.sh@73 -- # start_llvm_fuzz 12 1 0x1 00:08:03.306 20:55:18 -- nvmf/run.sh@23 -- # local fuzzer_type=12 00:08:03.306 20:55:18 -- nvmf/run.sh@24 -- # local timen=1 00:08:03.306 20:55:18 -- nvmf/run.sh@25 -- # local core=0x1 00:08:03.306 20:55:18 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:03.306 20:55:18 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_12.conf 00:08:03.306 20:55:18 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:03.306 20:55:18 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:03.306 20:55:18 -- nvmf/run.sh@34 -- # printf %02d 12 00:08:03.306 20:55:18 -- nvmf/run.sh@34 -- # port=4412 00:08:03.306 20:55:18 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:03.306 20:55:18 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' 00:08:03.306 20:55:18 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4412"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:03.306 20:55:18 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:03.306 20:55:18 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:03.306 20:55:18 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' -c /tmp/fuzz_json_12.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 -Z 12 00:08:03.306 [2024-04-25 20:55:18.803412] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 
00:08:03.306 [2024-04-25 20:55:18.803501] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid197452 ] 00:08:03.306 EAL: No free 2048 kB hugepages reported on node 1 00:08:03.306 [2024-04-25 20:55:18.949099] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:08:03.566 [2024-04-25 20:55:18.987214] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:03.566 [2024-04-25 20:55:19.006494] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:03.566 [2024-04-25 20:55:19.058607] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:03.566 [2024-04-25 20:55:19.074921] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4412 *** 00:08:03.566 INFO: Running with entropic power schedule (0xFF, 100). 00:08:03.566 INFO: Seed: 2114101085 00:08:03.566 INFO: Loaded 1 modules (348579 inline 8-bit counters): 348579 [0x2764bcc, 0x27b9d6f), 00:08:03.566 INFO: Loaded 1 PC tables (348579 PCs): 348579 [0x27b9d70,0x2d0b7a0), 00:08:03.566 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:03.566 INFO: A corpus is not provided, starting from an empty corpus 00:08:03.566 #2 INITED exec/s: 0 rss: 60Mb 00:08:03.566 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:03.566 This may also happen if the target rejected all inputs we tried so far 00:08:03.566 [2024-04-25 20:55:19.141705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:08000866 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.566 [2024-04-25 20:55:19.141741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.566 [2024-04-25 20:55:19.141872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.566 [2024-04-25 20:55:19.141892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.566 [2024-04-25 20:55:19.142018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.566 [2024-04-25 20:55:19.142038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.566 [2024-04-25 20:55:19.142157] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:66666666 cdw11:66660000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.566 [2024-04-25 20:55:19.142174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.826 NEW_FUNC[1/671]: 0x4b45b0 in fuzz_admin_directive_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:241 00:08:03.826 NEW_FUNC[2/671]: 0x4e0360 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:03.826 #6 NEW cov: 11669 ft: 11670 
corp: 2/35b lim: 40 exec/s: 0 rss: 68Mb L: 34/34 MS: 4 InsertRepeatedBytes-ChangeBit-CopyPart-InsertRepeatedBytes- 00:08:03.826 [2024-04-25 20:55:19.472545] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:08000866 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.826 [2024-04-25 20:55:19.472588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.826 [2024-04-25 20:55:19.472721] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:66666666 cdw11:6666669a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.826 [2024-04-25 20:55:19.472741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.826 [2024-04-25 20:55:19.472876] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:99999999 cdw11:99999266 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.826 [2024-04-25 20:55:19.472895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.826 [2024-04-25 20:55:19.473034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:66666666 cdw11:66660000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.826 [2024-04-25 20:55:19.473063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.085 #7 NEW cov: 11805 ft: 12210 corp: 3/69b lim: 40 exec/s: 0 rss: 68Mb L: 34/34 MS: 1 ChangeBinInt- 00:08:04.085 [2024-04-25 20:55:19.522307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:08000866 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.085 [2024-04-25 20:55:19.522337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.085 [2024-04-25 20:55:19.522464] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.085 [2024-04-25 20:55:19.522483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.085 [2024-04-25 20:55:19.522603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.085 [2024-04-25 20:55:19.522621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.085 [2024-04-25 20:55:19.522744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:66666666 cdw11:66666600 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.085 [2024-04-25 20:55:19.522761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.085 #8 NEW cov: 11811 ft: 12533 corp: 4/103b lim: 40 exec/s: 0 rss: 68Mb L: 34/34 MS: 1 CrossOver- 00:08:04.085 [2024-04-25 20:55:19.562506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:08000866 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.085 [2024-04-25 
20:55:19.562533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.085 [2024-04-25 20:55:19.562664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.085 [2024-04-25 20:55:19.562682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.085 [2024-04-25 20:55:19.562801] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:66666666 cdw11:66000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.085 [2024-04-25 20:55:19.562819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.085 #9 NEW cov: 11896 ft: 13042 corp: 5/127b lim: 40 exec/s: 0 rss: 68Mb L: 24/34 MS: 1 EraseBytes- 00:08:04.085 [2024-04-25 20:55:19.622639] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:08000866 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.085 [2024-04-25 20:55:19.622668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.085 [2024-04-25 20:55:19.622795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.085 [2024-04-25 20:55:19.622815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.085 [2024-04-25 20:55:19.622941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:66666666 cdw11:66000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.085 [2024-04-25 20:55:19.622960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.085 #10 NEW cov: 11896 ft: 13160 corp: 6/151b lim: 40 exec/s: 0 rss: 68Mb L: 24/34 MS: 1 ShuffleBytes- 00:08:04.085 [2024-04-25 20:55:19.672516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.085 [2024-04-25 20:55:19.672544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.085 [2024-04-25 20:55:19.672668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.085 [2024-04-25 20:55:19.672685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.085 #11 NEW cov: 11896 ft: 13500 corp: 7/173b lim: 40 exec/s: 0 rss: 69Mb L: 22/34 MS: 1 InsertRepeatedBytes- 00:08:04.085 [2024-04-25 20:55:19.722752] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.085 [2024-04-25 20:55:19.722780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.085 [2024-04-25 20:55:19.722905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffff0800 cdw11:08666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.085 [2024-04-25 20:55:19.722923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.345 #12 NEW cov: 11896 ft: 13590 corp: 8/195b lim: 40 exec/s: 0 rss: 69Mb L: 22/34 MS: 1 CrossOver- 00:08:04.345 [2024-04-25 20:55:19.772825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.345 [2024-04-25 20:55:19.772852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.345 [2024-04-25 20:55:19.772975] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.345 [2024-04-25 20:55:19.772999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.345 #13 NEW cov: 11896 ft: 13612 corp: 9/217b lim: 40 exec/s: 0 rss: 69Mb L: 22/34 MS: 1 ChangeByte- 00:08:04.345 [2024-04-25 20:55:19.812770] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:08000866 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.345 [2024-04-25 20:55:19.812797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.345 [2024-04-25 20:55:19.812922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.345 [2024-04-25 20:55:19.812940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.345 [2024-04-25 20:55:19.813074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:66466666 cdw11:66000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.345 [2024-04-25 20:55:19.813092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.345 #14 NEW cov: 11896 ft: 13673 corp: 10/241b lim: 40 exec/s: 0 rss: 69Mb L: 24/34 MS: 1 ChangeBit- 00:08:04.345 [2024-04-25 20:55:19.873425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:08000866 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.345 [2024-04-25 20:55:19.873453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.345 [2024-04-25 20:55:19.873582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:66666608 cdw11:00086666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.345 [2024-04-25 20:55:19.873606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.345 [2024-04-25 20:55:19.873735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:6666669a cdw11:66666699 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.345 [2024-04-25 20:55:19.873755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 
cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.345 #15 NEW cov: 11896 ft: 13735 corp: 11/272b lim: 40 exec/s: 0 rss: 69Mb L: 31/34 MS: 1 CrossOver- 00:08:04.345 [2024-04-25 20:55:19.933020] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a010000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.345 [2024-04-25 20:55:19.933047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.345 #16 NEW cov: 11896 ft: 14451 corp: 12/281b lim: 40 exec/s: 0 rss: 69Mb L: 9/34 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\017"- 00:08:04.345 [2024-04-25 20:55:19.973854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:08000866 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.345 [2024-04-25 20:55:19.973883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.345 [2024-04-25 20:55:19.974014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:66666666 cdw11:666666a4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.345 [2024-04-25 20:55:19.974033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.345 [2024-04-25 20:55:19.974181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:a4a4a4a4 cdw11:a4666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.345 [2024-04-25 20:55:19.974199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.345 [2024-04-25 20:55:19.974333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.345 [2024-04-25 20:55:19.974351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.345 [2024-04-25 20:55:19.974483] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:66666666 cdw11:66000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.345 [2024-04-25 20:55:19.974501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:04.345 #17 NEW cov: 11896 ft: 14537 corp: 13/321b lim: 40 exec/s: 0 rss: 69Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:08:04.605 [2024-04-25 20:55:20.013918] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:08000866 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.605 [2024-04-25 20:55:20.013946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.605 [2024-04-25 20:55:20.014070] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:66666666 cdw11:9a999999 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.605 [2024-04-25 20:55:20.014090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.605 [2024-04-25 20:55:20.014215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:999999a3 cdw11:66000000 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.605 [2024-04-25 20:55:20.014233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.605 NEW_FUNC[1/1]: 0x19e6b30 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:04.605 #18 NEW cov: 11919 ft: 14597 corp: 14/345b lim: 40 exec/s: 0 rss: 69Mb L: 24/40 MS: 1 ChangeBinInt- 00:08:04.605 [2024-04-25 20:55:20.053097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a01000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.605 [2024-04-25 20:55:20.053125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.605 #19 NEW cov: 11919 ft: 14628 corp: 15/354b lim: 40 exec/s: 0 rss: 69Mb L: 9/40 MS: 1 CopyPart- 00:08:04.605 [2024-04-25 20:55:20.103932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:fffffffe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.605 [2024-04-25 20:55:20.103960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.605 [2024-04-25 20:55:20.104088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffff0800 cdw11:08666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.605 [2024-04-25 20:55:20.104108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.605 #20 NEW cov: 11919 ft: 14674 corp: 16/376b lim: 40 exec/s: 20 rss: 70Mb L: 22/40 MS: 1 ChangeBinInt- 00:08:04.605 [2024-04-25 20:55:20.154639] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:08000866 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.605 [2024-04-25 20:55:20.154667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.605 [2024-04-25 20:55:20.154800] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.605 [2024-04-25 20:55:20.154831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.605 [2024-04-25 20:55:20.154955] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.605 [2024-04-25 20:55:20.154972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.605 [2024-04-25 20:55:20.155103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:66666666 cdw11:66660000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.605 [2024-04-25 20:55:20.155122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.605 #21 NEW cov: 11919 ft: 14691 corp: 17/410b lim: 40 exec/s: 21 rss: 70Mb L: 34/40 MS: 1 CrossOver- 00:08:04.605 [2024-04-25 20:55:20.193820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:08:04.605 [2024-04-25 20:55:20.193847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.605 [2024-04-25 20:55:20.193977] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffff0800 cdw11:083b6666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.605 [2024-04-25 20:55:20.193997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.605 #22 NEW cov: 11919 ft: 14711 corp: 18/433b lim: 40 exec/s: 22 rss: 70Mb L: 23/40 MS: 1 InsertByte- 00:08:04.605 [2024-04-25 20:55:20.244251] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ff09ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.605 [2024-04-25 20:55:20.244277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.605 [2024-04-25 20:55:20.244407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.605 [2024-04-25 20:55:20.244428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.605 #23 NEW cov: 11919 ft: 14745 corp: 19/456b lim: 40 exec/s: 23 rss: 70Mb L: 23/40 MS: 1 InsertByte- 00:08:04.865 [2024-04-25 20:55:20.284444] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0100 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.865 [2024-04-25 20:55:20.284471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.865 [2024-04-25 20:55:20.284600] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:000fffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.865 [2024-04-25 20:55:20.284617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.865 #24 NEW cov: 11919 ft: 14759 corp: 20/478b lim: 40 exec/s: 24 rss: 70Mb L: 22/40 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\017"- 00:08:04.865 [2024-04-25 20:55:20.324832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.865 [2024-04-25 20:55:20.324860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.865 [2024-04-25 20:55:20.324988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.865 [2024-04-25 20:55:20.325011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.865 [2024-04-25 20:55:20.325141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.865 [2024-04-25 20:55:20.325160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 
00:08:04.865 [2024-04-25 20:55:20.325291] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000a01 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.865 [2024-04-25 20:55:20.325312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.865 #25 NEW cov: 11919 ft: 14813 corp: 21/513b lim: 40 exec/s: 25 rss: 70Mb L: 35/40 MS: 1 InsertRepeatedBytes- 00:08:04.865 [2024-04-25 20:55:20.374641] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.865 [2024-04-25 20:55:20.374670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.865 [2024-04-25 20:55:20.374789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.865 [2024-04-25 20:55:20.374808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.865 #26 NEW cov: 11919 ft: 14826 corp: 22/536b lim: 40 exec/s: 26 rss: 70Mb L: 23/40 MS: 1 InsertByte- 00:08:04.865 [2024-04-25 20:55:20.414860] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:08000866 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.865 [2024-04-25 20:55:20.414889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.865 [2024-04-25 20:55:20.415022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:66666608 cdw11:00089999 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.865 [2024-04-25 20:55:20.415040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.865 #27 NEW cov: 11919 ft: 14841 corp: 23/554b lim: 40 exec/s: 27 rss: 70Mb L: 18/40 MS: 1 EraseBytes- 00:08:04.865 [2024-04-25 20:55:20.464716] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a010000 cdw11:40000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.865 [2024-04-25 20:55:20.464746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.865 #28 NEW cov: 11919 ft: 14861 corp: 24/563b lim: 40 exec/s: 28 rss: 70Mb L: 9/40 MS: 1 ChangeBit- 00:08:04.865 [2024-04-25 20:55:20.515649] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:08000866 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.865 [2024-04-25 20:55:20.515676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.865 [2024-04-25 20:55:20.515801] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.865 [2024-04-25 20:55:20.515821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.865 [2024-04-25 20:55:20.515948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) 
qid:0 cid:6 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.865 [2024-04-25 20:55:20.515966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.865 [2024-04-25 20:55:20.516097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.865 [2024-04-25 20:55:20.516116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.124 #29 NEW cov: 11919 ft: 14870 corp: 25/599b lim: 40 exec/s: 29 rss: 70Mb L: 36/40 MS: 1 CrossOver- 00:08:05.125 [2024-04-25 20:55:20.575244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:fffdffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.125 [2024-04-25 20:55:20.575273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.125 [2024-04-25 20:55:20.575404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffff0800 cdw11:083b6666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.125 [2024-04-25 20:55:20.575423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.125 #30 NEW cov: 11919 ft: 14888 corp: 26/622b lim: 40 exec/s: 30 rss: 70Mb L: 23/40 MS: 1 ChangeBit- 00:08:05.125 [2024-04-25 20:55:20.635089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a0100e3 cdw11:40000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.125 [2024-04-25 20:55:20.635117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.125 #31 NEW cov: 11919 ft: 14902 corp: 27/631b lim: 40 exec/s: 31 rss: 70Mb L: 9/40 MS: 1 ChangeByte- 00:08:05.125 [2024-04-25 20:55:20.695407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a010000 cdw11:00000400 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.125 [2024-04-25 20:55:20.695435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.125 #32 NEW cov: 11919 ft: 14910 corp: 28/640b lim: 40 exec/s: 32 rss: 70Mb L: 9/40 MS: 1 ChangeBit- 00:08:05.125 [2024-04-25 20:55:20.745544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.125 [2024-04-25 20:55:20.745572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.125 #33 NEW cov: 11919 ft: 14926 corp: 29/655b lim: 40 exec/s: 33 rss: 70Mb L: 15/40 MS: 1 EraseBytes- 00:08:05.385 [2024-04-25 20:55:20.795931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.385 [2024-04-25 20:55:20.795959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.385 [2024-04-25 20:55:20.796103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 
cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.385 [2024-04-25 20:55:20.796123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.385 #34 NEW cov: 11919 ft: 14927 corp: 30/676b lim: 40 exec/s: 34 rss: 70Mb L: 21/40 MS: 1 EraseBytes- 00:08:05.385 [2024-04-25 20:55:20.846624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:08000866 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.385 [2024-04-25 20:55:20.846651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.385 [2024-04-25 20:55:20.846773] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:66666666 cdw11:08000866 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.385 [2024-04-25 20:55:20.846791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.385 [2024-04-25 20:55:20.846918] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.385 [2024-04-25 20:55:20.846934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.385 [2024-04-25 20:55:20.847083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.385 [2024-04-25 20:55:20.847103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.385 #35 NEW cov: 11919 ft: 14944 corp: 31/715b lim: 40 exec/s: 35 rss: 70Mb L: 39/40 MS: 1 CrossOver- 00:08:05.385 [2024-04-25 20:55:20.896357] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:08000866 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.385 [2024-04-25 20:55:20.896382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.385 [2024-04-25 20:55:20.896509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:666666ff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.385 [2024-04-25 20:55:20.896527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.385 [2024-04-25 20:55:20.896653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffff66 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.385 [2024-04-25 20:55:20.896668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.385 [2024-04-25 20:55:20.896801] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.385 [2024-04-25 20:55:20.896819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.385 #36 NEW cov: 11919 ft: 15020 corp: 32/751b lim: 40 exec/s: 36 rss: 70Mb L: 36/40 MS: 1 InsertRepeatedBytes- 
00:08:05.385 [2024-04-25 20:55:20.937166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:08000866 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.385 [2024-04-25 20:55:20.937191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.385 [2024-04-25 20:55:20.937317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:66666608 cdw11:00ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.385 [2024-04-25 20:55:20.937335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.385 [2024-04-25 20:55:20.937463] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.385 [2024-04-25 20:55:20.937482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.385 [2024-04-25 20:55:20.937600] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.385 [2024-04-25 20:55:20.937616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.385 [2024-04-25 20:55:20.937742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:ffffff08 cdw11:99999266 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.385 [2024-04-25 20:55:20.937761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:05.385 #37 NEW cov: 11919 ft: 15028 corp: 33/791b lim: 40 exec/s: 37 rss: 70Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:08:05.385 [2024-04-25 20:55:20.976647] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:08000866 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.385 [2024-04-25 20:55:20.976675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.385 [2024-04-25 20:55:20.976803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:66666666 cdw11:08000866 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.385 [2024-04-25 20:55:20.976820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.385 [2024-04-25 20:55:20.976943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.386 [2024-04-25 20:55:20.976959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.386 [2024-04-25 20:55:20.977115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:66666666 cdw11:66666666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.386 [2024-04-25 20:55:20.977136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.386 #38 NEW cov: 11919 ft: 15032 corp: 34/828b lim: 40 exec/s: 38 rss: 70Mb 
L: 37/40 MS: 1 CrossOver- 00:08:05.386 [2024-04-25 20:55:21.037195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.386 [2024-04-25 20:55:21.037223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.386 [2024-04-25 20:55:21.037354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.386 [2024-04-25 20:55:21.037371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.386 [2024-04-25 20:55:21.037497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.386 [2024-04-25 20:55:21.037514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.386 [2024-04-25 20:55:21.037652] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:000a0100 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.386 [2024-04-25 20:55:21.037669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.645 #39 NEW cov: 11919 ft: 15040 corp: 35/862b lim: 40 exec/s: 39 rss: 70Mb L: 34/40 MS: 1 EraseBytes- 00:08:05.645 [2024-04-25 20:55:21.096805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.645 [2024-04-25 20:55:21.096832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.645 [2024-04-25 20:55:21.096974] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffff0800 cdw11:083b6666 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.645 [2024-04-25 20:55:21.096997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.645 #40 NEW cov: 11919 ft: 15057 corp: 36/879b lim: 40 exec/s: 20 rss: 70Mb L: 17/40 MS: 1 EraseBytes- 00:08:05.645 #40 DONE cov: 11919 ft: 15057 corp: 36/879b lim: 40 exec/s: 20 rss: 70Mb 00:08:05.646 ###### Recommended dictionary. ###### 00:08:05.646 "\001\000\000\000\000\000\000\017" # Uses: 1 00:08:05.646 ###### End of recommended dictionary. 
###### 00:08:05.646 Done 40 runs in 2 second(s) 00:08:05.646 20:55:21 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_12.conf /var/tmp/suppress_nvmf_fuzz 00:08:05.646 20:55:21 -- ../common.sh@72 -- # (( i++ )) 00:08:05.646 20:55:21 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:05.646 20:55:21 -- ../common.sh@73 -- # start_llvm_fuzz 13 1 0x1 00:08:05.646 20:55:21 -- nvmf/run.sh@23 -- # local fuzzer_type=13 00:08:05.646 20:55:21 -- nvmf/run.sh@24 -- # local timen=1 00:08:05.646 20:55:21 -- nvmf/run.sh@25 -- # local core=0x1 00:08:05.646 20:55:21 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:05.646 20:55:21 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_13.conf 00:08:05.646 20:55:21 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:05.646 20:55:21 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:05.646 20:55:21 -- nvmf/run.sh@34 -- # printf %02d 13 00:08:05.646 20:55:21 -- nvmf/run.sh@34 -- # port=4413 00:08:05.646 20:55:21 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:05.646 20:55:21 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' 00:08:05.646 20:55:21 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4413"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:05.646 20:55:21 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:05.646 20:55:21 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:05.646 20:55:21 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' -c /tmp/fuzz_json_13.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 -Z 13 00:08:05.646 [2024-04-25 20:55:21.263005] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:08:05.646 [2024-04-25 20:55:21.263099] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid197978 ] 00:08:05.646 EAL: No free 2048 kB hugepages reported on node 1 00:08:05.904 [2024-04-25 20:55:21.402070] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:08:05.904 [2024-04-25 20:55:21.438191] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:05.904 [2024-04-25 20:55:21.457185] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:05.904 [2024-04-25 20:55:21.509234] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:05.904 [2024-04-25 20:55:21.525498] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4413 *** 00:08:05.904 INFO: Running with entropic power schedule (0xFF, 100). 
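The leak-sanitizer wiring traced above (the two echo leak:... lines and the LSAN_OPTIONS string echoed into the EAL parameters) suppresses two known allocation sites so each run does not abort on expected leaks. A hedged sketch of the equivalent shell, assuming the echoes are appended to the suppress file (the redirection is not shown in the trace):

    suppress_file=/var/tmp/suppress_nvmf_fuzz
    # suppress leaks at these two SPDK call sites during fuzzing
    echo "leak:spdk_nvmf_qpair_disconnect" >> "$suppress_file"
    echo "leak:nvmf_ctrlr_create" >> "$suppress_file"
    export LSAN_OPTIONS="report_objects=1:suppressions=$suppress_file:print_suppressions=0"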
00:08:05.904 INFO: Seed: 271142466 00:08:05.904 INFO: Loaded 1 modules (348579 inline 8-bit counters): 348579 [0x2764bcc, 0x27b9d6f), 00:08:05.904 INFO: Loaded 1 PC tables (348579 PCs): 348579 [0x27b9d70,0x2d0b7a0), 00:08:05.904 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:05.904 INFO: A corpus is not provided, starting from an empty corpus 00:08:05.904 #2 INITED exec/s: 0 rss: 61Mb 00:08:05.904 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:05.904 This may also happen if the target rejected all inputs we tried so far 00:08:06.162 [2024-04-25 20:55:21.570966] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.162 [2024-04-25 20:55:21.570997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.162 [2024-04-25 20:55:21.571069] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.162 [2024-04-25 20:55:21.571083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.162 [2024-04-25 20:55:21.571149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.162 [2024-04-25 20:55:21.571162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.421 NEW_FUNC[1/670]: 0x4b6170 in fuzz_admin_directive_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:257 00:08:06.421 NEW_FUNC[2/670]: 0x4e0360 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:06.421 #3 NEW cov: 11663 ft: 11664 corp: 2/28b lim: 40 exec/s: 0 rss: 68Mb L: 27/27 MS: 1 InsertRepeatedBytes- 00:08:06.422 [2024-04-25 20:55:21.902861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a002f00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.422 [2024-04-25 20:55:21.902907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.422 [2024-04-25 20:55:21.903054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.422 [2024-04-25 20:55:21.903077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.422 [2024-04-25 20:55:21.903218] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.422 [2024-04-25 20:55:21.903241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.422 #9 NEW cov: 11793 ft: 12295 corp: 3/56b lim: 40 exec/s: 0 rss: 68Mb L: 28/28 MS: 1 InsertByte- 00:08:06.422 [2024-04-25 20:55:21.962831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.422 [2024-04-25 20:55:21.962861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.422 [2024-04-25 20:55:21.962997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000027 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.422 [2024-04-25 20:55:21.963017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.422 [2024-04-25 20:55:21.963151] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.422 [2024-04-25 20:55:21.963170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.422 #10 NEW cov: 11799 ft: 12594 corp: 4/84b lim: 40 exec/s: 0 rss: 68Mb L: 28/28 MS: 1 InsertByte- 00:08:06.422 [2024-04-25 20:55:22.013094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a002f00 cdw11:ffffff00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.422 [2024-04-25 20:55:22.013122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.422 [2024-04-25 20:55:22.013244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.422 [2024-04-25 20:55:22.013264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.422 [2024-04-25 20:55:22.013392] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.422 [2024-04-25 20:55:22.013409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.422 #11 NEW cov: 11884 ft: 12785 corp: 5/115b lim: 40 exec/s: 0 rss: 68Mb L: 31/31 MS: 1 InsertRepeatedBytes- 00:08:06.422 [2024-04-25 20:55:22.073323] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a002f00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.422 [2024-04-25 20:55:22.073352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.422 [2024-04-25 20:55:22.073493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.422 [2024-04-25 20:55:22.073512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.422 [2024-04-25 20:55:22.073642] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:08000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.422 [2024-04-25 20:55:22.073662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.681 #12 NEW cov: 
11884 ft: 12929 corp: 6/143b lim: 40 exec/s: 0 rss: 68Mb L: 28/31 MS: 1 ChangeBit- 00:08:06.681 [2024-04-25 20:55:22.123523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a202f00 cdw11:ffffff00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.681 [2024-04-25 20:55:22.123551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.681 [2024-04-25 20:55:22.123686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.681 [2024-04-25 20:55:22.123705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.681 [2024-04-25 20:55:22.123851] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.681 [2024-04-25 20:55:22.123870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.681 #13 NEW cov: 11884 ft: 12999 corp: 7/174b lim: 40 exec/s: 0 rss: 68Mb L: 31/31 MS: 1 ChangeBit- 00:08:06.681 [2024-04-25 20:55:22.183890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a002f00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.681 [2024-04-25 20:55:22.183924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.681 [2024-04-25 20:55:22.184065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.681 [2024-04-25 20:55:22.184083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.681 [2024-04-25 20:55:22.184223] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:08ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.681 [2024-04-25 20:55:22.184241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.681 [2024-04-25 20:55:22.184374] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ff000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.681 [2024-04-25 20:55:22.184392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.681 #14 NEW cov: 11884 ft: 13524 corp: 8/210b lim: 40 exec/s: 0 rss: 68Mb L: 36/36 MS: 1 InsertRepeatedBytes- 00:08:06.681 [2024-04-25 20:55:22.243796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a002f00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.681 [2024-04-25 20:55:22.243828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.681 [2024-04-25 20:55:22.243971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.681 
[2024-04-25 20:55:22.243989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.681 [2024-04-25 20:55:22.244134] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.681 [2024-04-25 20:55:22.244153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.681 #15 NEW cov: 11884 ft: 13535 corp: 9/238b lim: 40 exec/s: 0 rss: 69Mb L: 28/36 MS: 1 ChangeByte- 00:08:06.681 [2024-04-25 20:55:22.294171] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a002f00 cdw11:ffffff00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.681 [2024-04-25 20:55:22.294201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.681 [2024-04-25 20:55:22.294333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000a00 cdw11:2f000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.681 [2024-04-25 20:55:22.294350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.681 [2024-04-25 20:55:22.294482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.681 [2024-04-25 20:55:22.294502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.681 [2024-04-25 20:55:22.294628] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.681 [2024-04-25 20:55:22.294647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.681 #16 NEW cov: 11884 ft: 13619 corp: 10/275b lim: 40 exec/s: 0 rss: 69Mb L: 37/37 MS: 1 CrossOver- 00:08:06.940 [2024-04-25 20:55:22.344384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a002f00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.940 [2024-04-25 20:55:22.344414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.940 [2024-04-25 20:55:22.344553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.940 [2024-04-25 20:55:22.344570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.940 [2024-04-25 20:55:22.344699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:08ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.940 [2024-04-25 20:55:22.344717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.940 [2024-04-25 20:55:22.344861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff 
cdw11:ff000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.940 [2024-04-25 20:55:22.344880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.940 #17 NEW cov: 11884 ft: 13642 corp: 11/313b lim: 40 exec/s: 0 rss: 69Mb L: 38/38 MS: 1 CMP- DE: "\001\005"- 00:08:06.940 [2024-04-25 20:55:22.403990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a002f00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.940 [2024-04-25 20:55:22.404021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.940 [2024-04-25 20:55:22.404160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.940 [2024-04-25 20:55:22.404177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.940 #18 NEW cov: 11884 ft: 13873 corp: 12/330b lim: 40 exec/s: 0 rss: 69Mb L: 17/38 MS: 1 EraseBytes- 00:08:06.940 [2024-04-25 20:55:22.454402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00fe0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.940 [2024-04-25 20:55:22.454428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.940 [2024-04-25 20:55:22.454569] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000027 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.940 [2024-04-25 20:55:22.454587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.940 [2024-04-25 20:55:22.454713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.940 [2024-04-25 20:55:22.454732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.940 NEW_FUNC[1/1]: 0x19e6b30 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:06.940 #19 NEW cov: 11907 ft: 13905 corp: 13/358b lim: 40 exec/s: 0 rss: 69Mb L: 28/38 MS: 1 ChangeBinInt- 00:08:06.940 [2024-04-25 20:55:22.514569] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a002f00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.940 [2024-04-25 20:55:22.514597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.940 [2024-04-25 20:55:22.514726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.940 [2024-04-25 20:55:22.514747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.940 [2024-04-25 20:55:22.514874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.940 
[2024-04-25 20:55:22.514893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.940 #20 NEW cov: 11907 ft: 13928 corp: 14/387b lim: 40 exec/s: 0 rss: 69Mb L: 29/38 MS: 1 EraseBytes- 00:08:06.940 [2024-04-25 20:55:22.564253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a002f0a cdw11:002f0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.940 [2024-04-25 20:55:22.564280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.940 #21 NEW cov: 11907 ft: 14302 corp: 15/398b lim: 40 exec/s: 21 rss: 69Mb L: 11/38 MS: 1 CrossOver- 00:08:07.199 [2024-04-25 20:55:22.614860] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.199 [2024-04-25 20:55:22.614889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.199 [2024-04-25 20:55:22.615024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000027 cdw11:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.200 [2024-04-25 20:55:22.615042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.200 [2024-04-25 20:55:22.615166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.200 [2024-04-25 20:55:22.615182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.200 #22 NEW cov: 11907 ft: 14337 corp: 16/427b lim: 40 exec/s: 22 rss: 69Mb L: 29/38 MS: 1 InsertByte- 00:08:07.200 [2024-04-25 20:55:22.665330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a002f00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.200 [2024-04-25 20:55:22.665357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.200 [2024-04-25 20:55:22.665497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.200 [2024-04-25 20:55:22.665516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.200 [2024-04-25 20:55:22.665653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:08ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.200 [2024-04-25 20:55:22.665670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.200 [2024-04-25 20:55:22.665803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ff00003e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.200 [2024-04-25 20:55:22.665820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.200 #23 NEW cov: 11907 ft: 14352 corp: 
17/466b lim: 40 exec/s: 23 rss: 69Mb L: 39/39 MS: 1 InsertRepeatedBytes- 00:08:07.200 [2024-04-25 20:55:22.714749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a002f00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.200 [2024-04-25 20:55:22.714776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.200 #24 NEW cov: 11907 ft: 14365 corp: 18/478b lim: 40 exec/s: 24 rss: 69Mb L: 12/39 MS: 1 EraseBytes- 00:08:07.200 [2024-04-25 20:55:22.765316] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.200 [2024-04-25 20:55:22.765343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.200 [2024-04-25 20:55:22.765470] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000027 cdw11:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.200 [2024-04-25 20:55:22.765490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.200 [2024-04-25 20:55:22.765614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000041 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.200 [2024-04-25 20:55:22.765632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.200 #25 NEW cov: 11907 ft: 14387 corp: 19/508b lim: 40 exec/s: 25 rss: 69Mb L: 30/39 MS: 1 InsertByte- 00:08:07.200 [2024-04-25 20:55:22.815082] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a002f00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.200 [2024-04-25 20:55:22.815110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.200 #26 NEW cov: 11907 ft: 14413 corp: 20/520b lim: 40 exec/s: 26 rss: 70Mb L: 12/39 MS: 1 CopyPart- 00:08:07.459 [2024-04-25 20:55:22.865892] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a000105 cdw11:2f00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.459 [2024-04-25 20:55:22.865918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.459 [2024-04-25 20:55:22.866047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ff000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.459 [2024-04-25 20:55:22.866066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.459 [2024-04-25 20:55:22.866199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.459 [2024-04-25 20:55:22.866216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.459 [2024-04-25 20:55:22.866344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 
cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.459 [2024-04-25 20:55:22.866362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.459 #27 NEW cov: 11907 ft: 14449 corp: 21/553b lim: 40 exec/s: 27 rss: 70Mb L: 33/39 MS: 1 PersAutoDict- DE: "\001\005"- 00:08:07.459 [2024-04-25 20:55:22.915353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a002f00 cdw11:40000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.459 [2024-04-25 20:55:22.915382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.459 #28 NEW cov: 11907 ft: 14473 corp: 22/565b lim: 40 exec/s: 28 rss: 70Mb L: 12/39 MS: 1 ChangeBit- 00:08:07.459 [2024-04-25 20:55:22.966097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a002f00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.459 [2024-04-25 20:55:22.966122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.459 [2024-04-25 20:55:22.966244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:0000000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.459 [2024-04-25 20:55:22.966264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.459 [2024-04-25 20:55:22.966393] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:08ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.459 [2024-04-25 20:55:22.966411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.459 [2024-04-25 20:55:22.966546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ff000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.459 [2024-04-25 20:55:22.966563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.459 #29 NEW cov: 11907 ft: 14506 corp: 23/601b lim: 40 exec/s: 29 rss: 70Mb L: 36/39 MS: 1 CrossOver- 00:08:07.459 [2024-04-25 20:55:23.005323] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a202f00 cdw11:ffffff00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.459 [2024-04-25 20:55:23.005350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.459 [2024-04-25 20:55:23.005486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.459 [2024-04-25 20:55:23.005504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.459 #30 NEW cov: 11907 ft: 14537 corp: 24/620b lim: 40 exec/s: 30 rss: 70Mb L: 19/39 MS: 1 EraseBytes- 00:08:07.459 [2024-04-25 20:55:23.056135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a002f01 cdw11:00000000 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:08:07.459 [2024-04-25 20:55:23.056161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.459 [2024-04-25 20:55:23.056289] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.459 [2024-04-25 20:55:23.056307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.459 [2024-04-25 20:55:23.056441] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.460 [2024-04-25 20:55:23.056459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.460 #31 NEW cov: 11907 ft: 14551 corp: 25/648b lim: 40 exec/s: 31 rss: 70Mb L: 28/39 MS: 1 ChangeBit- 00:08:07.460 [2024-04-25 20:55:23.106324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a202f00 cdw11:ffffff00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.460 [2024-04-25 20:55:23.106352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.460 [2024-04-25 20:55:23.106478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.460 [2024-04-25 20:55:23.106496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.460 [2024-04-25 20:55:23.106628] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:05000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.460 [2024-04-25 20:55:23.106646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.718 #32 NEW cov: 11907 ft: 14557 corp: 26/679b lim: 40 exec/s: 32 rss: 70Mb L: 31/39 MS: 1 PersAutoDict- DE: "\001\005"- 00:08:07.718 [2024-04-25 20:55:23.156249] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a202f00 cdw11:ffffff00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.718 [2024-04-25 20:55:23.156276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.718 [2024-04-25 20:55:23.156409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.718 [2024-04-25 20:55:23.156427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.718 #33 NEW cov: 11907 ft: 14598 corp: 27/698b lim: 40 exec/s: 33 rss: 70Mb L: 19/39 MS: 1 CopyPart- 00:08:07.718 [2024-04-25 20:55:23.206777] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a002f00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.718 [2024-04-25 20:55:23.206806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:08:07.718 [2024-04-25 20:55:23.206937] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:0000000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.718 [2024-04-25 20:55:23.206954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.718 [2024-04-25 20:55:23.207090] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:08ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.718 [2024-04-25 20:55:23.207108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.718 [2024-04-25 20:55:23.207234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ff2dffff cdw11:ffff0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.718 [2024-04-25 20:55:23.207252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.718 #34 NEW cov: 11907 ft: 14615 corp: 28/735b lim: 40 exec/s: 34 rss: 70Mb L: 37/39 MS: 1 InsertByte- 00:08:07.718 [2024-04-25 20:55:23.266412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a002f00 cdw11:407e0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.718 [2024-04-25 20:55:23.266441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.718 #35 NEW cov: 11907 ft: 14675 corp: 29/748b lim: 40 exec/s: 35 rss: 70Mb L: 13/39 MS: 1 InsertByte- 00:08:07.718 [2024-04-25 20:55:23.327000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a002f00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.718 [2024-04-25 20:55:23.327028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.718 [2024-04-25 20:55:23.327171] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.718 [2024-04-25 20:55:23.327189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.718 [2024-04-25 20:55:23.327314] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:08003a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.718 [2024-04-25 20:55:23.327331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.718 #36 NEW cov: 11907 ft: 14677 corp: 30/777b lim: 40 exec/s: 36 rss: 70Mb L: 29/39 MS: 1 InsertByte- 00:08:07.718 [2024-04-25 20:55:23.377493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a002f00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.718 [2024-04-25 20:55:23.377523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.718 [2024-04-25 20:55:23.377656] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:08:07.718 [2024-04-25 20:55:23.377675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.718 [2024-04-25 20:55:23.377794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.718 [2024-04-25 20:55:23.377811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.718 [2024-04-25 20:55:23.377942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:ff00003e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.718 [2024-04-25 20:55:23.377961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.978 #37 NEW cov: 11907 ft: 14690 corp: 31/816b lim: 40 exec/s: 37 rss: 70Mb L: 39/39 MS: 1 CrossOver- 00:08:07.978 [2024-04-25 20:55:23.437483] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a002f00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.978 [2024-04-25 20:55:23.437516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.978 [2024-04-25 20:55:23.437650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.978 [2024-04-25 20:55:23.437670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.978 [2024-04-25 20:55:23.437798] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.978 [2024-04-25 20:55:23.437817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.978 #38 NEW cov: 11907 ft: 14696 corp: 32/841b lim: 40 exec/s: 38 rss: 70Mb L: 25/39 MS: 1 CrossOver- 00:08:07.978 [2024-04-25 20:55:23.487415] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a002f00 cdw11:00003b00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.978 [2024-04-25 20:55:23.487442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.978 [2024-04-25 20:55:23.487567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.978 [2024-04-25 20:55:23.487584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.978 #39 NEW cov: 11907 ft: 14702 corp: 33/858b lim: 40 exec/s: 39 rss: 70Mb L: 17/39 MS: 1 ChangeByte- 00:08:07.978 [2024-04-25 20:55:23.537258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a010500 cdw11:2f000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.978 [2024-04-25 20:55:23.537287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.978 #40 NEW cov: 
11907 ft: 14779 corp: 34/872b lim: 40 exec/s: 40 rss: 70Mb L: 14/39 MS: 1 PersAutoDict- DE: "\001\005"- 00:08:07.978 [2024-04-25 20:55:23.587889] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a002f01 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.978 [2024-04-25 20:55:23.587921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.978 [2024-04-25 20:55:23.588057] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.978 [2024-04-25 20:55:23.588076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.978 [2024-04-25 20:55:23.588213] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.978 [2024-04-25 20:55:23.588232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.978 #41 NEW cov: 11907 ft: 14792 corp: 35/902b lim: 40 exec/s: 20 rss: 70Mb L: 30/39 MS: 1 PersAutoDict- DE: "\001\005"- 00:08:07.978 #41 DONE cov: 11907 ft: 14792 corp: 35/902b lim: 40 exec/s: 20 rss: 70Mb 00:08:07.978 ###### Recommended dictionary. ###### 00:08:07.978 "\001\005" # Uses: 4 00:08:07.978 ###### End of recommended dictionary. ###### 00:08:07.978 Done 41 runs in 2 second(s) 00:08:08.238 20:55:23 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_13.conf /var/tmp/suppress_nvmf_fuzz 00:08:08.238 20:55:23 -- ../common.sh@72 -- # (( i++ )) 00:08:08.238 20:55:23 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:08.238 20:55:23 -- ../common.sh@73 -- # start_llvm_fuzz 14 1 0x1 00:08:08.238 20:55:23 -- nvmf/run.sh@23 -- # local fuzzer_type=14 00:08:08.238 20:55:23 -- nvmf/run.sh@24 -- # local timen=1 00:08:08.238 20:55:23 -- nvmf/run.sh@25 -- # local core=0x1 00:08:08.238 20:55:23 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:08.238 20:55:23 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_14.conf 00:08:08.238 20:55:23 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:08.238 20:55:23 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:08.238 20:55:23 -- nvmf/run.sh@34 -- # printf %02d 14 00:08:08.238 20:55:23 -- nvmf/run.sh@34 -- # port=4414 00:08:08.238 20:55:23 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:08.238 20:55:23 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' 00:08:08.238 20:55:23 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4414"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:08.238 20:55:23 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:08.238 20:55:23 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:08.238 20:55:23 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 
subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' -c /tmp/fuzz_json_14.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 -Z 14 00:08:08.238 [2024-04-25 20:55:23.763991] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:08:08.238 [2024-04-25 20:55:23.764076] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid198467 ] 00:08:08.238 EAL: No free 2048 kB hugepages reported on node 1 00:08:08.498 [2024-04-25 20:55:23.909060] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:08:08.498 [2024-04-25 20:55:23.945349] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:08.498 [2024-04-25 20:55:23.964544] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:08.498 [2024-04-25 20:55:24.016578] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:08.498 [2024-04-25 20:55:24.032846] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4414 *** 00:08:08.498 INFO: Running with entropic power schedule (0xFF, 100). 00:08:08.498 INFO: Seed: 2778180235 00:08:08.498 INFO: Loaded 1 modules (348579 inline 8-bit counters): 348579 [0x2764bcc, 0x27b9d6f), 00:08:08.498 INFO: Loaded 1 PC tables (348579 PCs): 348579 [0x27b9d70,0x2d0b7a0), 00:08:08.498 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:08.498 INFO: A corpus is not provided, starting from an empty corpus 00:08:08.498 #2 INITED exec/s: 0 rss: 60Mb 00:08:08.498 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:08.498 This may also happen if the target rejected all inputs we tried so far 00:08:08.498 [2024-04-25 20:55:24.088156] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.498 [2024-04-25 20:55:24.088183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.757 NEW_FUNC[1/671]: 0x4b7d30 in fuzz_admin_set_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:392 00:08:08.757 NEW_FUNC[2/671]: 0x4e0360 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:08.758 #4 NEW cov: 11657 ft: 11649 corp: 2/10b lim: 35 exec/s: 0 rss: 68Mb L: 9/9 MS: 2 ChangeByte-CMP- DE: "\000\000\000\000\000\000\000\034"- 00:08:08.758 [2024-04-25 20:55:24.409027] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.758 [2024-04-25 20:55:24.409057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.758 [2024-04-25 20:55:24.409114] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.758 [2024-04-25 20:55:24.409128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.017 #10 NEW cov: 11787 ft: 12885 corp: 3/26b lim: 35 exec/s: 0 rss: 68Mb L: 16/16 MS: 1 InsertRepeatedBytes- 00:08:09.017 [2024-04-25 20:55:24.458943] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.017 [2024-04-25 20:55:24.458968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.017 #11 NEW cov: 11793 ft: 13091 corp: 4/36b lim: 35 exec/s: 0 rss: 68Mb L: 10/16 MS: 1 InsertByte- 00:08:09.017 [2024-04-25 20:55:24.499164] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.017 [2024-04-25 20:55:24.499189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.017 [2024-04-25 20:55:24.499262] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.017 [2024-04-25 20:55:24.499275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.017 #12 NEW cov: 11878 ft: 13373 corp: 5/53b lim: 35 exec/s: 0 rss: 68Mb L: 17/17 MS: 1 InsertByte- 00:08:09.017 NEW_FUNC[1/2]: 0x4d91f0 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:08:09.017 NEW_FUNC[2/2]: 0x1197a10 in nvmf_ctrlr_set_features_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:1709 00:08:09.017 #13 NEW cov: 11911 ft: 13530 corp: 6/64b lim: 35 exec/s: 0 rss: 69Mb L: 11/17 MS: 1 CrossOver- 00:08:09.017 [2024-04-25 20:55:24.589466] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:08:09.017 [2024-04-25 20:55:24.589491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.017 [2024-04-25 20:55:24.589547] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.017 [2024-04-25 20:55:24.589563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.017 #14 NEW cov: 11911 ft: 13595 corp: 7/81b lim: 35 exec/s: 0 rss: 69Mb L: 17/17 MS: 1 InsertByte- 00:08:09.017 [2024-04-25 20:55:24.629414] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.017 [2024-04-25 20:55:24.629439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.017 #15 NEW cov: 11911 ft: 13654 corp: 8/92b lim: 35 exec/s: 0 rss: 69Mb L: 11/17 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\034"- 00:08:09.017 [2024-04-25 20:55:24.669729] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.017 [2024-04-25 20:55:24.669754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.017 [2024-04-25 20:55:24.669812] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.017 [2024-04-25 20:55:24.669826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.297 #16 NEW cov: 11911 ft: 13676 corp: 9/109b lim: 35 exec/s: 0 rss: 69Mb L: 17/17 MS: 1 CopyPart- 00:08:09.297 #17 NEW cov: 11911 ft: 13723 corp: 10/120b lim: 35 exec/s: 0 rss: 69Mb L: 11/17 MS: 1 ShuffleBytes- 00:08:09.297 [2024-04-25 20:55:24.749931] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.297 [2024-04-25 20:55:24.749956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.297 [2024-04-25 20:55:24.750012] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.297 [2024-04-25 20:55:24.750025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.297 #18 NEW cov: 11911 ft: 13783 corp: 11/137b lim: 35 exec/s: 0 rss: 69Mb L: 17/17 MS: 1 ChangeByte- 00:08:09.297 #19 NEW cov: 11911 ft: 13830 corp: 12/148b lim: 35 exec/s: 0 rss: 69Mb L: 11/17 MS: 1 ChangeBit- 00:08:09.297 [2024-04-25 20:55:24.820337] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.297 [2024-04-25 20:55:24.820362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.297 [2024-04-25 20:55:24.820421] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.297 [2024-04-25 20:55:24.820434] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.297 [2024-04-25 20:55:24.820491] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.297 [2024-04-25 20:55:24.820504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.297 #20 NEW cov: 11911 ft: 14074 corp: 13/173b lim: 35 exec/s: 0 rss: 69Mb L: 25/25 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\034"- 00:08:09.297 #21 NEW cov: 11911 ft: 14090 corp: 14/184b lim: 35 exec/s: 0 rss: 69Mb L: 11/25 MS: 1 CMP- DE: "\376\377"- 00:08:09.297 #22 NEW cov: 11911 ft: 14097 corp: 15/195b lim: 35 exec/s: 0 rss: 69Mb L: 11/25 MS: 1 CrossOver- 00:08:09.561 #23 NEW cov: 11911 ft: 14127 corp: 16/205b lim: 35 exec/s: 0 rss: 70Mb L: 10/25 MS: 1 EraseBytes- 00:08:09.561 NEW_FUNC[1/1]: 0x19e6b30 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:09.561 #24 NEW cov: 11934 ft: 14187 corp: 17/216b lim: 35 exec/s: 0 rss: 70Mb L: 11/25 MS: 1 CopyPart- 00:08:09.561 [2024-04-25 20:55:25.021003] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.561 [2024-04-25 20:55:25.021032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.561 [2024-04-25 20:55:25.021090] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.561 [2024-04-25 20:55:25.021103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.561 [2024-04-25 20:55:25.021159] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.561 [2024-04-25 20:55:25.021173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.561 [2024-04-25 20:55:25.021228] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.561 [2024-04-25 20:55:25.021241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.561 #25 NEW cov: 11934 ft: 14497 corp: 18/248b lim: 35 exec/s: 0 rss: 70Mb L: 32/32 MS: 1 InsertRepeatedBytes- 00:08:09.561 [2024-04-25 20:55:25.070693] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.561 [2024-04-25 20:55:25.070718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.561 #26 NEW cov: 11934 ft: 14504 corp: 19/256b lim: 35 exec/s: 26 rss: 70Mb L: 8/32 MS: 1 EraseBytes- 00:08:09.561 [2024-04-25 20:55:25.110809] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.561 [2024-04-25 20:55:25.110833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.561 #29 NEW cov: 11934 ft: 14512 corp: 20/268b lim: 35 
exec/s: 29 rss: 70Mb L: 12/32 MS: 3 ShuffleBytes-CrossOver-CrossOver- 00:08:09.561 #30 NEW cov: 11934 ft: 14569 corp: 21/276b lim: 35 exec/s: 30 rss: 70Mb L: 8/32 MS: 1 EraseBytes- 00:08:09.561 #31 NEW cov: 11934 ft: 14596 corp: 22/284b lim: 35 exec/s: 31 rss: 70Mb L: 8/32 MS: 1 ChangeBit- 00:08:09.561 [2024-04-25 20:55:25.211535] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.561 [2024-04-25 20:55:25.211561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.561 [2024-04-25 20:55:25.211619] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.561 [2024-04-25 20:55:25.211635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.561 [2024-04-25 20:55:25.211693] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.561 [2024-04-25 20:55:25.211706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.561 [2024-04-25 20:55:25.211764] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.561 [2024-04-25 20:55:25.211777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.820 #32 NEW cov: 11934 ft: 14655 corp: 23/316b lim: 35 exec/s: 32 rss: 70Mb L: 32/32 MS: 1 CrossOver- 00:08:09.820 [2024-04-25 20:55:25.261722] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.820 [2024-04-25 20:55:25.261748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.820 [2024-04-25 20:55:25.261809] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.820 [2024-04-25 20:55:25.261823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.820 [2024-04-25 20:55:25.261878] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.820 [2024-04-25 20:55:25.261892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.820 [2024-04-25 20:55:25.261949] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.820 [2024-04-25 20:55:25.261962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.820 #33 NEW cov: 11934 ft: 14716 corp: 24/348b lim: 35 exec/s: 33 rss: 70Mb L: 32/32 MS: 1 ChangeBinInt- 00:08:09.820 #39 NEW cov: 11934 ft: 14728 corp: 25/361b lim: 35 exec/s: 39 rss: 70Mb L: 13/32 MS: 1 PersAutoDict- DE: "\376\377"- 00:08:09.820 [2024-04-25 20:55:25.331406] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000f7 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.820 [2024-04-25 20:55:25.331431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.820 #40 NEW cov: 11934 ft: 14740 corp: 26/372b lim: 35 exec/s: 40 rss: 70Mb L: 11/32 MS: 1 ChangeBinInt- 00:08:09.820 [2024-04-25 20:55:25.371540] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000037 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.820 [2024-04-25 20:55:25.371568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.820 #41 NEW cov: 11941 ft: 14808 corp: 27/383b lim: 35 exec/s: 41 rss: 70Mb L: 11/32 MS: 1 CMP- DE: "7\264&\334\376\374v\000"- 00:08:09.820 [2024-04-25 20:55:25.411624] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.820 [2024-04-25 20:55:25.411648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.820 #42 NEW cov: 11941 ft: 14845 corp: 28/395b lim: 35 exec/s: 42 rss: 70Mb L: 12/32 MS: 1 InsertByte- 00:08:09.820 #43 NEW cov: 11941 ft: 14857 corp: 29/406b lim: 35 exec/s: 43 rss: 70Mb L: 11/32 MS: 1 ChangeBit- 00:08:10.079 #44 NEW cov: 11941 ft: 14866 corp: 30/417b lim: 35 exec/s: 44 rss: 70Mb L: 11/32 MS: 1 ChangeByte- 00:08:10.079 [2024-04-25 20:55:25.512077] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.079 [2024-04-25 20:55:25.512102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.079 [2024-04-25 20:55:25.512159] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.079 [2024-04-25 20:55:25.512173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.079 #45 NEW cov: 11941 ft: 14902 corp: 31/434b lim: 35 exec/s: 45 rss: 70Mb L: 17/32 MS: 1 CMP- DE: "\000\000\000\001"- 00:08:10.079 #46 NEW cov: 11941 ft: 14917 corp: 32/445b lim: 35 exec/s: 46 rss: 70Mb L: 11/32 MS: 1 ChangeBinInt- 00:08:10.079 [2024-04-25 20:55:25.582632] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.079 [2024-04-25 20:55:25.582658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.079 [2024-04-25 20:55:25.582718] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.079 [2024-04-25 20:55:25.582733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.079 [2024-04-25 20:55:25.582796] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.079 [2024-04-25 20:55:25.582810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.079 #47 NEW cov: 11941 ft: 15017 corp: 33/473b lim: 35 exec/s: 47 rss: 70Mb 
L: 28/32 MS: 1 InsertRepeatedBytes- 00:08:10.079 #48 NEW cov: 11941 ft: 15022 corp: 34/481b lim: 35 exec/s: 48 rss: 70Mb L: 8/32 MS: 1 ChangeBinInt- 00:08:10.079 [2024-04-25 20:55:25.672383] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.079 [2024-04-25 20:55:25.672408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.079 #49 NEW cov: 11941 ft: 15055 corp: 35/493b lim: 35 exec/s: 49 rss: 70Mb L: 12/32 MS: 1 ChangeByte- 00:08:10.079 #50 NEW cov: 11941 ft: 15062 corp: 36/504b lim: 35 exec/s: 50 rss: 70Mb L: 11/32 MS: 1 CrossOver- 00:08:10.339 [2024-04-25 20:55:25.752602] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.339 [2024-04-25 20:55:25.752626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.339 #55 NEW cov: 11941 ft: 15069 corp: 37/512b lim: 35 exec/s: 55 rss: 70Mb L: 8/32 MS: 5 CrossOver-InsertByte-CopyPart-InsertByte-CrossOver- 00:08:10.339 [2024-04-25 20:55:25.783152] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.339 [2024-04-25 20:55:25.783176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.339 [2024-04-25 20:55:25.783237] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.339 [2024-04-25 20:55:25.783251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.339 [2024-04-25 20:55:25.783309] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.339 [2024-04-25 20:55:25.783322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.339 [2024-04-25 20:55:25.783381] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.339 [2024-04-25 20:55:25.783394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.339 #56 NEW cov: 11941 ft: 15104 corp: 38/545b lim: 35 exec/s: 56 rss: 70Mb L: 33/33 MS: 1 InsertByte- 00:08:10.339 #60 NEW cov: 11941 ft: 15131 corp: 39/558b lim: 35 exec/s: 60 rss: 70Mb L: 13/33 MS: 4 EraseBytes-CrossOver-CrossOver-CrossOver- 00:08:10.339 #61 NEW cov: 11941 ft: 15136 corp: 40/569b lim: 35 exec/s: 61 rss: 70Mb L: 11/33 MS: 1 CopyPart- 00:08:10.339 [2024-04-25 20:55:25.893017] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.339 [2024-04-25 20:55:25.893042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.339 #67 NEW cov: 11941 ft: 15138 corp: 41/581b lim: 35 exec/s: 67 rss: 70Mb L: 12/33 MS: 1 ChangeByte- 00:08:10.339 [2024-04-25 20:55:25.933335] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 
cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.339 [2024-04-25 20:55:25.933360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.339 [2024-04-25 20:55:25.933419] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.339 [2024-04-25 20:55:25.933437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.339 #68 NEW cov: 11941 ft: 15150 corp: 42/598b lim: 35 exec/s: 68 rss: 70Mb L: 17/33 MS: 1 CopyPart- 00:08:10.339 [2024-04-25 20:55:25.973390] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.339 [2024-04-25 20:55:25.973414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.339 [2024-04-25 20:55:25.973469] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.339 [2024-04-25 20:55:25.973482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.339 #69 NEW cov: 11941 ft: 15155 corp: 43/615b lim: 35 exec/s: 69 rss: 70Mb L: 17/33 MS: 1 ShuffleBytes- 00:08:10.599 [2024-04-25 20:55:26.013672] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.599 [2024-04-25 20:55:26.013696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.599 [2024-04-25 20:55:26.013752] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.599 [2024-04-25 20:55:26.013765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.599 [2024-04-25 20:55:26.013819] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.599 [2024-04-25 20:55:26.013832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.599 #70 NEW cov: 11941 ft: 15160 corp: 44/636b lim: 35 exec/s: 70 rss: 70Mb L: 21/33 MS: 1 InsertRepeatedBytes- 00:08:10.599 [2024-04-25 20:55:26.053473] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.599 [2024-04-25 20:55:26.053497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.599 #71 NEW cov: 11941 ft: 15208 corp: 45/644b lim: 35 exec/s: 35 rss: 70Mb L: 8/33 MS: 1 ChangeBit- 00:08:10.599 #71 DONE cov: 11941 ft: 15208 corp: 45/644b lim: 35 exec/s: 35 rss: 70Mb 00:08:10.599 ###### Recommended dictionary. ###### 00:08:10.599 "\000\000\000\000\000\000\000\034" # Uses: 2 00:08:10.599 "\376\377" # Uses: 2 00:08:10.599 "7\264&\334\376\374v\000" # Uses: 0 00:08:10.600 "\000\000\000\001" # Uses: 0 00:08:10.600 ###### End of recommended dictionary. 
###### 00:08:10.600 Done 71 runs in 2 second(s) 00:08:10.600 20:55:26 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_14.conf /var/tmp/suppress_nvmf_fuzz 00:08:10.600 20:55:26 -- ../common.sh@72 -- # (( i++ )) 00:08:10.600 20:55:26 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:10.600 20:55:26 -- ../common.sh@73 -- # start_llvm_fuzz 15 1 0x1 00:08:10.600 20:55:26 -- nvmf/run.sh@23 -- # local fuzzer_type=15 00:08:10.600 20:55:26 -- nvmf/run.sh@24 -- # local timen=1 00:08:10.600 20:55:26 -- nvmf/run.sh@25 -- # local core=0x1 00:08:10.600 20:55:26 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:10.600 20:55:26 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_15.conf 00:08:10.600 20:55:26 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:10.600 20:55:26 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:10.600 20:55:26 -- nvmf/run.sh@34 -- # printf %02d 15 00:08:10.600 20:55:26 -- nvmf/run.sh@34 -- # port=4415 00:08:10.600 20:55:26 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:10.600 20:55:26 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' 00:08:10.600 20:55:26 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4415"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:10.600 20:55:26 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:10.600 20:55:26 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:10.600 20:55:26 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' -c /tmp/fuzz_json_15.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 -Z 15 00:08:10.600 [2024-04-25 20:55:26.218446] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:08:10.600 [2024-04-25 20:55:26.218530] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid198799 ] 00:08:10.600 EAL: No free 2048 kB hugepages reported on node 1 00:08:10.859 [2024-04-25 20:55:26.358841] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:08:10.859 [2024-04-25 20:55:26.396817] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:10.859 [2024-04-25 20:55:26.415957] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:10.859 [2024-04-25 20:55:26.468193] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:10.859 [2024-04-25 20:55:26.484517] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4415 *** 00:08:10.859 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:10.859 INFO: Seed: 933181186 00:08:11.119 INFO: Loaded 1 modules (348579 inline 8-bit counters): 348579 [0x2764bcc, 0x27b9d6f), 00:08:11.119 INFO: Loaded 1 PC tables (348579 PCs): 348579 [0x27b9d70,0x2d0b7a0), 00:08:11.119 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:11.119 INFO: A corpus is not provided, starting from an empty corpus 00:08:11.119 #2 INITED exec/s: 0 rss: 60Mb 00:08:11.119 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:11.119 This may also happen if the target rejected all inputs we tried so far 00:08:11.119 [2024-04-25 20:55:26.554866] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES AUTONOMOUS POWER STATE TRANSITION cid:4 cdw10:0000070c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.119 [2024-04-25 20:55:26.554902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.119 [2024-04-25 20:55:26.555058] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.119 [2024-04-25 20:55:26.555078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.119 [2024-04-25 20:55:26.555223] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.119 [2024-04-25 20:55:26.555243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.119 [2024-04-25 20:55:26.555375] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.119 [2024-04-25 20:55:26.555393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.378 NEW_FUNC[1/670]: 0x4b9270 in fuzz_admin_get_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:460 00:08:11.378 NEW_FUNC[2/670]: 0x4e0360 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:11.378 #10 NEW cov: 11645 ft: 11646 corp: 2/30b lim: 35 exec/s: 0 rss: 68Mb L: 29/29 MS: 3 CopyPart-ChangeByte-InsertRepeatedBytes- 00:08:11.378 NEW_FUNC[1/1]: 0x4d91f0 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:08:11.378 #13 NEW cov: 11789 ft: 12812 corp: 3/37b lim: 35 exec/s: 0 rss: 68Mb L: 7/29 MS: 3 ShuffleBytes-ShuffleBytes-InsertRepeatedBytes- 00:08:11.378 #14 NEW cov: 11795 ft: 12904 corp: 4/44b lim: 35 exec/s: 0 rss: 68Mb L: 7/29 MS: 1 ChangeBinInt- 00:08:11.378 [2024-04-25 20:55:26.996252] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES AUTONOMOUS POWER STATE TRANSITION cid:4 cdw10:0000070c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.378 [2024-04-25 20:55:26.996289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.378 [2024-04-25 20:55:26.996435] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.378 [2024-04-25 20:55:26.996465] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.378 [2024-04-25 20:55:26.996611] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.378 [2024-04-25 20:55:26.996628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.378 [2024-04-25 20:55:26.996778] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.378 [2024-04-25 20:55:26.996797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.378 #15 NEW cov: 11880 ft: 13135 corp: 5/73b lim: 35 exec/s: 0 rss: 68Mb L: 29/29 MS: 1 ShuffleBytes- 00:08:11.638 [2024-04-25 20:55:27.056413] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES AUTONOMOUS POWER STATE TRANSITION cid:4 cdw10:0000070c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.638 [2024-04-25 20:55:27.056446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.638 [2024-04-25 20:55:27.056597] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.638 [2024-04-25 20:55:27.056618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.638 [2024-04-25 20:55:27.056779] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.638 [2024-04-25 20:55:27.056799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.638 [2024-04-25 20:55:27.056956] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.638 [2024-04-25 20:55:27.056975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.638 #16 NEW cov: 11880 ft: 13270 corp: 6/102b lim: 35 exec/s: 0 rss: 68Mb L: 29/29 MS: 1 ChangeByte- 00:08:11.638 [2024-04-25 20:55:27.106668] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000485 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.638 [2024-04-25 20:55:27.106698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.638 [2024-04-25 20:55:27.106836] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000485 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.638 [2024-04-25 20:55:27.106854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.638 [2024-04-25 20:55:27.106996] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000485 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.638 [2024-04-25 20:55:27.107016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.638 [2024-04-25 20:55:27.107160] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000485 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.638 [2024-04-25 20:55:27.107182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.638 #17 NEW cov: 11880 ft: 13364 corp: 7/134b lim: 35 exec/s: 0 rss: 68Mb L: 32/32 MS: 1 InsertRepeatedBytes- 00:08:11.638 [2024-04-25 20:55:27.156730] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000485 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.638 [2024-04-25 20:55:27.156757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.638 [2024-04-25 20:55:27.156903] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000485 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.638 [2024-04-25 20:55:27.156922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.638 [2024-04-25 20:55:27.157075] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000485 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.638 [2024-04-25 20:55:27.157094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.638 [2024-04-25 20:55:27.157236] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000485 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.638 [2024-04-25 20:55:27.157255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.638 #23 NEW cov: 11880 ft: 13430 corp: 8/168b lim: 35 exec/s: 0 rss: 69Mb L: 34/34 MS: 1 CopyPart- 00:08:11.638 [2024-04-25 20:55:27.217017] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES AUTONOMOUS POWER STATE TRANSITION cid:4 cdw10:0000070c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.638 [2024-04-25 20:55:27.217048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.638 [2024-04-25 20:55:27.217223] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.638 [2024-04-25 20:55:27.217245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.638 [2024-04-25 20:55:27.217403] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.638 [2024-04-25 20:55:27.217422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.638 [2024-04-25 20:55:27.217569] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.638 [2024-04-25 20:55:27.217589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.638 #24 NEW cov: 11880 ft: 13461 corp: 9/197b lim: 35 exec/s: 0 rss: 69Mb L: 29/34 MS: 1 ChangeBinInt- 00:08:11.638 [2024-04-25 20:55:27.277032] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED 
cid:4 cdw10:00000485 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.638 [2024-04-25 20:55:27.277061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.638 [2024-04-25 20:55:27.277216] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000485 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.638 [2024-04-25 20:55:27.277237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.638 [2024-04-25 20:55:27.277403] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000485 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.638 [2024-04-25 20:55:27.277422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.638 [2024-04-25 20:55:27.277588] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000485 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.638 [2024-04-25 20:55:27.277609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.638 #25 NEW cov: 11880 ft: 13568 corp: 10/229b lim: 35 exec/s: 0 rss: 69Mb L: 32/34 MS: 1 ShuffleBytes- 00:08:11.897 [2024-04-25 20:55:27.327314] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000485 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.897 [2024-04-25 20:55:27.327342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.897 [2024-04-25 20:55:27.327493] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000485 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.897 [2024-04-25 20:55:27.327511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.897 [2024-04-25 20:55:27.327664] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000485 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.897 [2024-04-25 20:55:27.327683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.897 [2024-04-25 20:55:27.327840] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000485 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.897 [2024-04-25 20:55:27.327859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.897 #26 NEW cov: 11880 ft: 13646 corp: 11/263b lim: 35 exec/s: 0 rss: 69Mb L: 34/34 MS: 1 ChangeBinInt- 00:08:11.897 [2024-04-25 20:55:27.387406] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES AUTONOMOUS POWER STATE TRANSITION cid:4 cdw10:0000070c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.897 [2024-04-25 20:55:27.387435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.897 [2024-04-25 20:55:27.387599] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.897 [2024-04-25 20:55:27.387617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) 
qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.897 [2024-04-25 20:55:27.387772] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.897 [2024-04-25 20:55:27.387790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.897 [2024-04-25 20:55:27.387937] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.897 [2024-04-25 20:55:27.387955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.897 #27 NEW cov: 11880 ft: 13698 corp: 12/293b lim: 35 exec/s: 0 rss: 69Mb L: 30/34 MS: 1 CrossOver- 00:08:11.897 [2024-04-25 20:55:27.447691] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000485 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.897 [2024-04-25 20:55:27.447720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.897 [2024-04-25 20:55:27.447883] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000485 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.897 [2024-04-25 20:55:27.447901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.897 [2024-04-25 20:55:27.448062] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000485 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.897 [2024-04-25 20:55:27.448082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.897 [2024-04-25 20:55:27.448229] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000485 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.898 [2024-04-25 20:55:27.448246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.898 NEW_FUNC[1/1]: 0x19e6b30 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:11.898 #28 NEW cov: 11903 ft: 13797 corp: 13/327b lim: 35 exec/s: 0 rss: 69Mb L: 34/34 MS: 1 ShuffleBytes- 00:08:11.898 [2024-04-25 20:55:27.497804] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES AUTONOMOUS POWER STATE TRANSITION cid:4 cdw10:0000070c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.898 [2024-04-25 20:55:27.497831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.898 [2024-04-25 20:55:27.497977] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.898 [2024-04-25 20:55:27.497998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.898 [2024-04-25 20:55:27.498143] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.898 [2024-04-25 20:55:27.498164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.898 
[2024-04-25 20:55:27.498316] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.898 [2024-04-25 20:55:27.498337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.898 #29 NEW cov: 11903 ft: 13838 corp: 14/357b lim: 35 exec/s: 0 rss: 69Mb L: 30/34 MS: 1 InsertByte- 00:08:11.898 [2024-04-25 20:55:27.558022] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES AUTONOMOUS POWER STATE TRANSITION cid:4 cdw10:0000070c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.898 [2024-04-25 20:55:27.558049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.898 [2024-04-25 20:55:27.558201] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.898 [2024-04-25 20:55:27.558220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.898 [2024-04-25 20:55:27.558374] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.898 [2024-04-25 20:55:27.558391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.898 [2024-04-25 20:55:27.558534] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.898 [2024-04-25 20:55:27.558556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.157 #30 NEW cov: 11903 ft: 13896 corp: 15/386b lim: 35 exec/s: 30 rss: 69Mb L: 29/34 MS: 1 ChangeByte- 00:08:12.157 [2024-04-25 20:55:27.608134] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000485 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.157 [2024-04-25 20:55:27.608162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.157 [2024-04-25 20:55:27.608317] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000485 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.157 [2024-04-25 20:55:27.608338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.157 [2024-04-25 20:55:27.608484] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000485 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.157 [2024-04-25 20:55:27.608503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.157 [2024-04-25 20:55:27.608657] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000485 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.157 [2024-04-25 20:55:27.608676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.157 #31 NEW cov: 11903 ft: 13945 corp: 16/420b lim: 35 exec/s: 31 rss: 69Mb L: 34/34 MS: 1 ChangeBit- 00:08:12.157 [2024-04-25 20:55:27.658388] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES AUTONOMOUS POWER STATE TRANSITION cid:4 cdw10:0000070c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.157 [2024-04-25 20:55:27.658416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.157 [2024-04-25 20:55:27.658563] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.157 [2024-04-25 20:55:27.658584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.157 [2024-04-25 20:55:27.658736] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000743 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.157 [2024-04-25 20:55:27.658756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.157 [2024-04-25 20:55:27.658903] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.157 [2024-04-25 20:55:27.658924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.157 #32 NEW cov: 11903 ft: 13950 corp: 17/451b lim: 35 exec/s: 32 rss: 70Mb L: 31/34 MS: 1 InsertByte- 00:08:12.157 [2024-04-25 20:55:27.707647] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000485 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.157 [2024-04-25 20:55:27.707674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.157 #36 NEW cov: 11903 ft: 14084 corp: 18/463b lim: 35 exec/s: 36 rss: 70Mb L: 12/34 MS: 4 CopyPart-ChangeBit-CopyPart-CrossOver- 00:08:12.157 [2024-04-25 20:55:27.757862] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES AUTONOMOUS POWER STATE TRANSITION cid:4 cdw10:0000070c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.157 [2024-04-25 20:55:27.757891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.157 #37 NEW cov: 11903 ft: 14093 corp: 19/472b lim: 35 exec/s: 37 rss: 70Mb L: 9/34 MS: 1 CrossOver- 00:08:12.416 #38 NEW cov: 11903 ft: 14098 corp: 20/479b lim: 35 exec/s: 38 rss: 70Mb L: 7/34 MS: 1 ChangeBit- 00:08:12.416 [2024-04-25 20:55:27.868932] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.416 [2024-04-25 20:55:27.868962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.416 [2024-04-25 20:55:27.869106] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000422 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.416 [2024-04-25 20:55:27.869127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.416 [2024-04-25 20:55:27.869284] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000485 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.416 [2024-04-25 20:55:27.869306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 
cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.416 [2024-04-25 20:55:27.869467] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000485 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.416 [2024-04-25 20:55:27.869486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.416 #39 NEW cov: 11903 ft: 14148 corp: 21/513b lim: 35 exec/s: 39 rss: 70Mb L: 34/34 MS: 1 ChangeBinInt- 00:08:12.416 [2024-04-25 20:55:27.919134] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000485 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.416 [2024-04-25 20:55:27.919164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.416 [2024-04-25 20:55:27.919322] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000485 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.416 [2024-04-25 20:55:27.919343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.416 [2024-04-25 20:55:27.919492] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.416 [2024-04-25 20:55:27.919513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.417 [2024-04-25 20:55:27.919669] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000485 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.417 [2024-04-25 20:55:27.919689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.417 #40 NEW cov: 11903 ft: 14164 corp: 22/546b lim: 35 exec/s: 40 rss: 70Mb L: 33/34 MS: 1 CrossOver- 00:08:12.417 [2024-04-25 20:55:27.968790] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000485 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.417 [2024-04-25 20:55:27.968819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.417 [2024-04-25 20:55:27.968963] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000485 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.417 [2024-04-25 20:55:27.968984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.417 #41 NEW cov: 11903 ft: 14441 corp: 23/563b lim: 35 exec/s: 41 rss: 70Mb L: 17/34 MS: 1 EraseBytes- 00:08:12.417 #42 NEW cov: 11903 ft: 14462 corp: 24/570b lim: 35 exec/s: 42 rss: 70Mb L: 7/34 MS: 1 ChangeByte- 00:08:12.417 [2024-04-25 20:55:28.069596] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000485 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.417 [2024-04-25 20:55:28.069626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.417 [2024-04-25 20:55:28.069787] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000485 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.417 [2024-04-25 20:55:28.069807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.417 [2024-04-25 20:55:28.069954] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000485 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.417 [2024-04-25 20:55:28.069974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.417 [2024-04-25 20:55:28.070123] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000485 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.417 [2024-04-25 20:55:28.070143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.676 #48 NEW cov: 11903 ft: 14522 corp: 25/604b lim: 35 exec/s: 48 rss: 70Mb L: 34/34 MS: 1 CopyPart- 00:08:12.676 [2024-04-25 20:55:28.119800] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES AUTONOMOUS POWER STATE TRANSITION cid:4 cdw10:0000070c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.676 [2024-04-25 20:55:28.119831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.676 [2024-04-25 20:55:28.119975] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.676 [2024-04-25 20:55:28.119997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.676 [2024-04-25 20:55:28.120142] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.676 [2024-04-25 20:55:28.120161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.676 [2024-04-25 20:55:28.120311] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.676 [2024-04-25 20:55:28.120329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.676 #49 NEW cov: 11903 ft: 14538 corp: 26/633b lim: 35 exec/s: 49 rss: 70Mb L: 29/34 MS: 1 CopyPart- 00:08:12.676 [2024-04-25 20:55:28.169206] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.676 [2024-04-25 20:55:28.169234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.676 #50 NEW cov: 11903 ft: 14565 corp: 27/642b lim: 35 exec/s: 50 rss: 70Mb L: 9/34 MS: 1 ShuffleBytes- 00:08:12.676 [2024-04-25 20:55:28.230083] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000485 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.676 [2024-04-25 20:55:28.230112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.676 [2024-04-25 20:55:28.230264] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000485 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.676 [2024-04-25 20:55:28.230284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.676 
[2024-04-25 20:55:28.230435] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000485 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.676 [2024-04-25 20:55:28.230452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.676 [2024-04-25 20:55:28.230598] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000485 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.676 [2024-04-25 20:55:28.230617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.676 #51 NEW cov: 11903 ft: 14566 corp: 28/676b lim: 35 exec/s: 51 rss: 70Mb L: 34/34 MS: 1 ChangeBit- 00:08:12.676 [2024-04-25 20:55:28.290202] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES AUTONOMOUS POWER STATE TRANSITION cid:4 cdw10:0000070c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.676 [2024-04-25 20:55:28.290233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.677 [2024-04-25 20:55:28.290394] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.677 [2024-04-25 20:55:28.290414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.677 [2024-04-25 20:55:28.290570] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.677 [2024-04-25 20:55:28.290588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.677 [2024-04-25 20:55:28.290739] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.677 [2024-04-25 20:55:28.290761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.677 #52 NEW cov: 11903 ft: 14666 corp: 29/706b lim: 35 exec/s: 52 rss: 70Mb L: 30/34 MS: 1 ChangeBit- 00:08:12.937 [2024-04-25 20:55:28.350517] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000485 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.937 [2024-04-25 20:55:28.350546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.937 [2024-04-25 20:55:28.350708] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000485 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.937 [2024-04-25 20:55:28.350730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.937 [2024-04-25 20:55:28.350883] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000004c5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.937 [2024-04-25 20:55:28.350902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.937 [2024-04-25 20:55:28.351053] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000485 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.937 
[2024-04-25 20:55:28.351075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.937 #53 NEW cov: 11903 ft: 14676 corp: 30/739b lim: 35 exec/s: 53 rss: 70Mb L: 33/34 MS: 1 InsertByte- 00:08:12.937 [2024-04-25 20:55:28.400706] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000485 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.937 [2024-04-25 20:55:28.400736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.937 [2024-04-25 20:55:28.400897] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000485 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.937 [2024-04-25 20:55:28.400918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.937 [2024-04-25 20:55:28.401081] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000485 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.937 [2024-04-25 20:55:28.401101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.937 [2024-04-25 20:55:28.401242] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000485 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.937 [2024-04-25 20:55:28.401261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.937 #54 NEW cov: 11903 ft: 14685 corp: 31/773b lim: 35 exec/s: 54 rss: 70Mb L: 34/34 MS: 1 ChangeBit- 00:08:12.937 [2024-04-25 20:55:28.450049] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000003f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.937 [2024-04-25 20:55:28.450079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.937 #55 NEW cov: 11903 ft: 14716 corp: 32/782b lim: 35 exec/s: 55 rss: 70Mb L: 9/34 MS: 1 CMP- DE: "~X\205\244\000\375v\000"- 00:08:12.937 [2024-04-25 20:55:28.510553] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES AUTONOMOUS POWER STATE TRANSITION cid:4 cdw10:0000070c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.937 [2024-04-25 20:55:28.510582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.937 [2024-04-25 20:55:28.510736] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.937 [2024-04-25 20:55:28.510760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.937 #56 NEW cov: 11903 ft: 14735 corp: 33/800b lim: 35 exec/s: 28 rss: 70Mb L: 18/34 MS: 1 CrossOver- 00:08:12.937 #56 DONE cov: 11903 ft: 14735 corp: 33/800b lim: 35 exec/s: 28 rss: 70Mb 00:08:12.937 ###### Recommended dictionary. ###### 00:08:12.937 "~X\205\244\000\375v\000" # Uses: 0 00:08:12.937 ###### End of recommended dictionary. 
###### 00:08:12.937 Done 56 runs in 2 second(s) 00:08:13.195 20:55:28 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_15.conf /var/tmp/suppress_nvmf_fuzz 00:08:13.195 20:55:28 -- ../common.sh@72 -- # (( i++ )) 00:08:13.195 20:55:28 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:13.195 20:55:28 -- ../common.sh@73 -- # start_llvm_fuzz 16 1 0x1 00:08:13.195 20:55:28 -- nvmf/run.sh@23 -- # local fuzzer_type=16 00:08:13.195 20:55:28 -- nvmf/run.sh@24 -- # local timen=1 00:08:13.195 20:55:28 -- nvmf/run.sh@25 -- # local core=0x1 00:08:13.195 20:55:28 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:13.195 20:55:28 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_16.conf 00:08:13.195 20:55:28 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:13.195 20:55:28 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:13.195 20:55:28 -- nvmf/run.sh@34 -- # printf %02d 16 00:08:13.195 20:55:28 -- nvmf/run.sh@34 -- # port=4416 00:08:13.195 20:55:28 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:13.195 20:55:28 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' 00:08:13.196 20:55:28 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4416"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:13.196 20:55:28 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:13.196 20:55:28 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:13.196 20:55:28 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' -c /tmp/fuzz_json_16.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 -Z 16 00:08:13.196 [2024-04-25 20:55:28.672526] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:08:13.196 [2024-04-25 20:55:28.672580] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid199329 ] 00:08:13.196 EAL: No free 2048 kB hugepages reported on node 1 00:08:13.196 [2024-04-25 20:55:28.806014] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:08:13.196 [2024-04-25 20:55:28.844576] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:13.454 [2024-04-25 20:55:28.864712] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:13.454 [2024-04-25 20:55:28.916810] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:13.454 [2024-04-25 20:55:28.933078] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4416 *** 00:08:13.454 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:13.454 INFO: Seed: 3383168770 00:08:13.454 INFO: Loaded 1 modules (348579 inline 8-bit counters): 348579 [0x2764bcc, 0x27b9d6f), 00:08:13.454 INFO: Loaded 1 PC tables (348579 PCs): 348579 [0x27b9d70,0x2d0b7a0), 00:08:13.454 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:13.454 INFO: A corpus is not provided, starting from an empty corpus 00:08:13.454 #2 INITED exec/s: 0 rss: 60Mb 00:08:13.454 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:13.454 This may also happen if the target rejected all inputs we tried so far 00:08:13.454 [2024-04-25 20:55:28.978442] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2449473536 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.455 [2024-04-25 20:55:28.978476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.455 [2024-04-25 20:55:28.978514] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.455 [2024-04-25 20:55:28.978529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.455 [2024-04-25 20:55:28.978584] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.455 [2024-04-25 20:55:28.978599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.455 [2024-04-25 20:55:28.978652] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.455 [2024-04-25 20:55:28.978666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:13.713 NEW_FUNC[1/671]: 0x4ba720 in fuzz_nvm_read_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:519 00:08:13.713 NEW_FUNC[2/671]: 0x4e0360 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:13.713 #7 NEW cov: 11749 ft: 11750 corp: 2/94b lim: 105 exec/s: 0 rss: 68Mb L: 93/93 MS: 5 ChangeBit-ChangeBinInt-ChangeBit-ShuffleBytes-InsertRepeatedBytes- 00:08:13.713 [2024-04-25 20:55:29.299340] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2449473536 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.713 [2024-04-25 20:55:29.299374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.713 [2024-04-25 20:55:29.299429] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:17 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.713 [2024-04-25 20:55:29.299445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.713 [2024-04-25 20:55:29.299504] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.713 [2024-04-25 20:55:29.299519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.713 [2024-04-25 20:55:29.299575] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.713 [2024-04-25 20:55:29.299588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:13.713 #8 NEW cov: 11879 ft: 12212 corp: 3/187b lim: 105 exec/s: 0 rss: 68Mb L: 93/93 MS: 1 ChangeBit- 00:08:13.713 [2024-04-25 20:55:29.349410] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:169953198080 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.713 [2024-04-25 20:55:29.349439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.713 [2024-04-25 20:55:29.349481] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.713 [2024-04-25 20:55:29.349496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.713 [2024-04-25 20:55:29.349551] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.713 [2024-04-25 20:55:29.349567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.713 [2024-04-25 20:55:29.349624] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.713 [2024-04-25 20:55:29.349639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:13.971 #9 NEW cov: 11885 ft: 12399 corp: 4/281b lim: 105 exec/s: 0 rss: 68Mb L: 94/94 MS: 1 InsertByte- 00:08:13.972 [2024-04-25 20:55:29.389225] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:169953198080 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.972 [2024-04-25 20:55:29.389252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.972 [2024-04-25 20:55:29.389290] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.972 [2024-04-25 20:55:29.389305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.972 #15 NEW cov: 11970 ft: 13298 corp: 5/337b lim: 105 exec/s: 0 rss: 68Mb L: 56/94 MS: 1 CrossOver- 00:08:13.972 [2024-04-25 20:55:29.439439] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:14902075602529865422 len:52943 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.972 [2024-04-25 20:55:29.439466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.972 [2024-04-25 20:55:29.439507] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:14902075604643794638 len:52943 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.972 [2024-04-25 20:55:29.439524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE 
OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.972 #18 NEW cov: 11970 ft: 13381 corp: 6/382b lim: 105 exec/s: 0 rss: 68Mb L: 45/94 MS: 3 InsertByte-EraseBytes-InsertRepeatedBytes- 00:08:13.972 [2024-04-25 20:55:29.479754] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:169953198080 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.972 [2024-04-25 20:55:29.479782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.972 [2024-04-25 20:55:29.479828] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.972 [2024-04-25 20:55:29.479844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.972 [2024-04-25 20:55:29.479899] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.972 [2024-04-25 20:55:29.479914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.972 [2024-04-25 20:55:29.479972] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.972 [2024-04-25 20:55:29.479987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:13.972 #19 NEW cov: 11970 ft: 13585 corp: 7/476b lim: 105 exec/s: 0 rss: 69Mb L: 94/94 MS: 1 ShuffleBytes- 00:08:13.972 [2024-04-25 20:55:29.519870] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:169953198080 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.972 [2024-04-25 20:55:29.519899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.972 [2024-04-25 20:55:29.519941] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.972 [2024-04-25 20:55:29.519956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.972 [2024-04-25 20:55:29.520018] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.972 [2024-04-25 20:55:29.520035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.972 [2024-04-25 20:55:29.520094] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.972 [2024-04-25 20:55:29.520110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:13.972 #20 NEW cov: 11970 ft: 13647 corp: 8/579b lim: 105 exec/s: 0 rss: 69Mb L: 103/103 MS: 1 CopyPart- 00:08:13.972 [2024-04-25 20:55:29.560015] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:169953198080 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.972 [2024-04-25 20:55:29.560042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.972 [2024-04-25 20:55:29.560092] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.972 [2024-04-25 20:55:29.560108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.972 [2024-04-25 20:55:29.560163] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.972 [2024-04-25 20:55:29.560179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.972 [2024-04-25 20:55:29.560237] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.972 [2024-04-25 20:55:29.560269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:13.972 #21 NEW cov: 11970 ft: 13683 corp: 9/683b lim: 105 exec/s: 0 rss: 69Mb L: 104/104 MS: 1 CrossOver- 00:08:13.972 [2024-04-25 20:55:29.599824] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:169953198080 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.972 [2024-04-25 20:55:29.599851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.972 [2024-04-25 20:55:29.599907] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.972 [2024-04-25 20:55:29.599923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.972 #22 NEW cov: 11970 ft: 13741 corp: 10/740b lim: 105 exec/s: 0 rss: 69Mb L: 57/104 MS: 1 InsertByte- 00:08:14.230 [2024-04-25 20:55:29.640232] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2449473536 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.230 [2024-04-25 20:55:29.640259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.230 [2024-04-25 20:55:29.640306] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.230 [2024-04-25 20:55:29.640321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.230 [2024-04-25 20:55:29.640376] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.230 [2024-04-25 20:55:29.640391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.230 [2024-04-25 20:55:29.640450] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.230 [2024-04-25 20:55:29.640469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.230 #23 NEW cov: 11970 ft: 13830 corp: 11/833b lim: 
105 exec/s: 0 rss: 69Mb L: 93/104 MS: 1 ChangeBinInt- 00:08:14.230 [2024-04-25 20:55:29.680324] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:169953198080 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.230 [2024-04-25 20:55:29.680352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.230 [2024-04-25 20:55:29.680400] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:6944656590844747872 len:24673 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.230 [2024-04-25 20:55:29.680416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.230 [2024-04-25 20:55:29.680472] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:6944656592455360608 len:24673 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.230 [2024-04-25 20:55:29.680487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.230 [2024-04-25 20:55:29.680541] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:16 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.230 [2024-04-25 20:55:29.680557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.230 #24 NEW cov: 11970 ft: 13860 corp: 12/922b lim: 105 exec/s: 0 rss: 69Mb L: 89/104 MS: 1 InsertRepeatedBytes- 00:08:14.230 [2024-04-25 20:55:29.720439] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:169953198080 len:65281 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.230 [2024-04-25 20:55:29.720466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.230 [2024-04-25 20:55:29.720515] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.230 [2024-04-25 20:55:29.720530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.230 [2024-04-25 20:55:29.720584] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.231 [2024-04-25 20:55:29.720600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.231 [2024-04-25 20:55:29.720657] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.231 [2024-04-25 20:55:29.720673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.231 #25 NEW cov: 11970 ft: 13888 corp: 13/1019b lim: 105 exec/s: 0 rss: 69Mb L: 97/104 MS: 1 InsertRepeatedBytes- 00:08:14.231 [2024-04-25 20:55:29.760542] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:169953198080 len:65281 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.231 [2024-04-25 20:55:29.760569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 
00:08:14.231 [2024-04-25 20:55:29.760609] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.231 [2024-04-25 20:55:29.760625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.231 [2024-04-25 20:55:29.760681] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.231 [2024-04-25 20:55:29.760696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.231 [2024-04-25 20:55:29.760754] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.231 [2024-04-25 20:55:29.760770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.231 #26 NEW cov: 11970 ft: 13900 corp: 14/1116b lim: 105 exec/s: 0 rss: 69Mb L: 97/104 MS: 1 ChangeByte- 00:08:14.231 [2024-04-25 20:55:29.800415] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:169953198080 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.231 [2024-04-25 20:55:29.800442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.231 [2024-04-25 20:55:29.800494] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.231 [2024-04-25 20:55:29.800510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.231 #27 NEW cov: 11970 ft: 13928 corp: 15/1168b lim: 105 exec/s: 0 rss: 69Mb L: 52/104 MS: 1 EraseBytes- 00:08:14.231 [2024-04-25 20:55:29.840764] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:14902075602529865422 len:52943 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.231 [2024-04-25 20:55:29.840791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.231 [2024-04-25 20:55:29.840840] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.231 [2024-04-25 20:55:29.840855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.231 [2024-04-25 20:55:29.840912] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.231 [2024-04-25 20:55:29.840928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.231 [2024-04-25 20:55:29.841007] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446743863256154111 len:52943 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.231 [2024-04-25 20:55:29.841023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.231 NEW_FUNC[1/1]: 0x19e6b30 in get_rusage 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:14.231 #28 NEW cov: 11993 ft: 13976 corp: 16/1266b lim: 105 exec/s: 0 rss: 69Mb L: 98/104 MS: 1 InsertRepeatedBytes- 00:08:14.231 [2024-04-25 20:55:29.890951] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:169953198080 len:65281 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.231 [2024-04-25 20:55:29.890979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.231 [2024-04-25 20:55:29.891036] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.231 [2024-04-25 20:55:29.891052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.231 [2024-04-25 20:55:29.891107] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.231 [2024-04-25 20:55:29.891123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.231 [2024-04-25 20:55:29.891183] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:3250700737 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.231 [2024-04-25 20:55:29.891202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.490 #34 NEW cov: 11993 ft: 13991 corp: 17/1367b lim: 105 exec/s: 0 rss: 69Mb L: 101/104 MS: 1 InsertRepeatedBytes- 00:08:14.490 [2024-04-25 20:55:29.931034] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:14902075602529865422 len:52943 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.490 [2024-04-25 20:55:29.931061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.490 [2024-04-25 20:55:29.931108] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.490 [2024-04-25 20:55:29.931123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.490 [2024-04-25 20:55:29.931176] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.490 [2024-04-25 20:55:29.931192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.490 [2024-04-25 20:55:29.931249] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446743863256154111 len:52943 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.490 [2024-04-25 20:55:29.931265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.490 #35 NEW cov: 11993 ft: 14014 corp: 18/1465b lim: 105 exec/s: 0 rss: 70Mb L: 98/104 MS: 1 ChangeBinInt- 00:08:14.491 [2024-04-25 20:55:29.970920] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:169953198080 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 
0x0 00:08:14.491 [2024-04-25 20:55:29.970948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.491 [2024-04-25 20:55:29.971007] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:4097 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.491 [2024-04-25 20:55:29.971023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.491 #36 NEW cov: 11993 ft: 14026 corp: 19/1521b lim: 105 exec/s: 36 rss: 70Mb L: 56/104 MS: 1 CopyPart- 00:08:14.491 [2024-04-25 20:55:30.011284] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:169953198080 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.491 [2024-04-25 20:55:30.011312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.491 [2024-04-25 20:55:30.011355] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.491 [2024-04-25 20:55:30.011370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.491 [2024-04-25 20:55:30.011427] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:26369 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.491 [2024-04-25 20:55:30.011442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.491 [2024-04-25 20:55:30.011504] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:1152921504606846976 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.491 [2024-04-25 20:55:30.011520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.491 #37 NEW cov: 11993 ft: 14116 corp: 20/1615b lim: 105 exec/s: 37 rss: 70Mb L: 94/104 MS: 1 CrossOver- 00:08:14.491 [2024-04-25 20:55:30.051383] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:169953198080 len:65281 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.491 [2024-04-25 20:55:30.051414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.491 [2024-04-25 20:55:30.051451] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.491 [2024-04-25 20:55:30.051467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.491 [2024-04-25 20:55:30.051523] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.491 [2024-04-25 20:55:30.051539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.491 [2024-04-25 20:55:30.051596] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.491 [2024-04-25 20:55:30.051612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.491 #38 NEW cov: 11993 ft: 14166 corp: 21/1712b lim: 105 exec/s: 38 rss: 70Mb L: 97/104 MS: 1 ChangeBit- 00:08:14.491 [2024-04-25 20:55:30.091265] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:169953198080 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.491 [2024-04-25 20:55:30.091295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.491 [2024-04-25 20:55:30.091343] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.491 [2024-04-25 20:55:30.091360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.491 #39 NEW cov: 11993 ft: 14191 corp: 22/1757b lim: 105 exec/s: 39 rss: 70Mb L: 45/104 MS: 1 EraseBytes- 00:08:14.491 [2024-04-25 20:55:30.131231] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.491 [2024-04-25 20:55:30.131259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.750 #42 NEW cov: 11993 ft: 14661 corp: 23/1796b lim: 105 exec/s: 42 rss: 70Mb L: 39/104 MS: 3 ChangeByte-InsertByte-InsertRepeatedBytes- 00:08:14.750 [2024-04-25 20:55:30.171731] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:169953198080 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.750 [2024-04-25 20:55:30.171758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.750 [2024-04-25 20:55:30.171801] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:654311424 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.750 [2024-04-25 20:55:30.171818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.750 [2024-04-25 20:55:30.171874] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.750 [2024-04-25 20:55:30.171891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.750 [2024-04-25 20:55:30.171951] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.750 [2024-04-25 20:55:30.171967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.750 #43 NEW cov: 11993 ft: 14673 corp: 24/1890b lim: 105 exec/s: 43 rss: 70Mb L: 94/104 MS: 1 ChangeByte- 00:08:14.750 [2024-04-25 20:55:30.211453] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:4485090715960753726 len:15935 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.750 [2024-04-25 20:55:30.211483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.750 #44 NEW cov: 11993 ft: 14687 corp: 25/1912b lim: 105 exec/s: 44 rss: 70Mb L: 22/104 MS: 1 
InsertRepeatedBytes- 00:08:14.750 [2024-04-25 20:55:30.251772] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:169953198080 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.750 [2024-04-25 20:55:30.251799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.750 [2024-04-25 20:55:30.251856] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.750 [2024-04-25 20:55:30.251873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.750 #45 NEW cov: 11993 ft: 14734 corp: 26/1969b lim: 105 exec/s: 45 rss: 70Mb L: 57/104 MS: 1 ChangeBit- 00:08:14.750 [2024-04-25 20:55:30.302077] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:169953198080 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.750 [2024-04-25 20:55:30.302104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.750 [2024-04-25 20:55:30.302145] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.750 [2024-04-25 20:55:30.302161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.750 [2024-04-25 20:55:30.302218] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:16 len:513 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.750 [2024-04-25 20:55:30.302235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.750 #46 NEW cov: 11993 ft: 15004 corp: 27/2037b lim: 105 exec/s: 46 rss: 70Mb L: 68/104 MS: 1 CopyPart- 00:08:14.750 [2024-04-25 20:55:30.352074] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:169953198080 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.751 [2024-04-25 20:55:30.352101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.751 [2024-04-25 20:55:30.352174] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1152921504606846976 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.751 [2024-04-25 20:55:30.352190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.751 #47 NEW cov: 11993 ft: 15016 corp: 28/2085b lim: 105 exec/s: 47 rss: 70Mb L: 48/104 MS: 1 EraseBytes- 00:08:14.751 [2024-04-25 20:55:30.392459] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:169953198080 len:65281 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.751 [2024-04-25 20:55:30.392486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.751 [2024-04-25 20:55:30.392539] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.751 [2024-04-25 20:55:30.392556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.751 [2024-04-25 20:55:30.392609] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.751 [2024-04-25 20:55:30.392625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.751 [2024-04-25 20:55:30.392681] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:3250700737 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.751 [2024-04-25 20:55:30.392699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.010 #48 NEW cov: 11993 ft: 15060 corp: 29/2188b lim: 105 exec/s: 48 rss: 70Mb L: 103/104 MS: 1 CrossOver- 00:08:15.010 [2024-04-25 20:55:30.432545] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2449473536 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.010 [2024-04-25 20:55:30.432572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.010 [2024-04-25 20:55:30.432621] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:17 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.010 [2024-04-25 20:55:30.432636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.010 [2024-04-25 20:55:30.432691] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.010 [2024-04-25 20:55:30.432707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.010 [2024-04-25 20:55:30.432768] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:420906795008 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.010 [2024-04-25 20:55:30.432784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.010 #49 NEW cov: 11993 ft: 15070 corp: 30/2282b lim: 105 exec/s: 49 rss: 70Mb L: 94/104 MS: 1 InsertByte- 00:08:15.010 [2024-04-25 20:55:30.472587] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:169953198080 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.010 [2024-04-25 20:55:30.472613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.010 [2024-04-25 20:55:30.472658] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:65536 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.010 [2024-04-25 20:55:30.472673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.010 [2024-04-25 20:55:30.472730] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.010 [2024-04-25 20:55:30.472745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.010 #50 NEW cov: 11993 ft: 15090 corp: 31/2346b 
lim: 105 exec/s: 50 rss: 70Mb L: 64/104 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\000"- 00:08:15.010 [2024-04-25 20:55:30.512750] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:230698027843584 len:9985 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.010 [2024-04-25 20:55:30.512777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.010 [2024-04-25 20:55:30.512825] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.010 [2024-04-25 20:55:30.512841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.010 [2024-04-25 20:55:30.512897] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.010 [2024-04-25 20:55:30.512913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.010 [2024-04-25 20:55:30.512969] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.010 [2024-04-25 20:55:30.512985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.010 #51 NEW cov: 11993 ft: 15096 corp: 32/2443b lim: 105 exec/s: 51 rss: 70Mb L: 97/104 MS: 1 InsertRepeatedBytes- 00:08:15.011 [2024-04-25 20:55:30.552865] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:169953198080 len:65281 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.011 [2024-04-25 20:55:30.552892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.011 [2024-04-25 20:55:30.552943] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.011 [2024-04-25 20:55:30.552977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.011 [2024-04-25 20:55:30.553038] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.011 [2024-04-25 20:55:30.553055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.011 [2024-04-25 20:55:30.553114] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.011 [2024-04-25 20:55:30.553129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.011 #57 NEW cov: 11993 ft: 15115 corp: 33/2540b lim: 105 exec/s: 57 rss: 70Mb L: 97/104 MS: 1 CopyPart- 00:08:15.011 [2024-04-25 20:55:30.593059] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2449473536 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.011 [2024-04-25 20:55:30.593086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.011 
[2024-04-25 20:55:30.593135] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.011 [2024-04-25 20:55:30.593150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.011 [2024-04-25 20:55:30.593202] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.011 [2024-04-25 20:55:30.593219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.011 [2024-04-25 20:55:30.593273] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.011 [2024-04-25 20:55:30.593288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.011 #58 NEW cov: 11993 ft: 15133 corp: 34/2633b lim: 105 exec/s: 58 rss: 70Mb L: 93/104 MS: 1 ChangeBinInt- 00:08:15.011 [2024-04-25 20:55:30.633169] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:169953198080 len:65281 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.011 [2024-04-25 20:55:30.633196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.011 [2024-04-25 20:55:30.633246] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.011 [2024-04-25 20:55:30.633262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.011 [2024-04-25 20:55:30.633318] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.011 [2024-04-25 20:55:30.633332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.011 [2024-04-25 20:55:30.633387] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.011 [2024-04-25 20:55:30.633405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.011 #59 NEW cov: 11993 ft: 15151 corp: 35/2730b lim: 105 exec/s: 59 rss: 70Mb L: 97/104 MS: 1 ChangeBinInt- 00:08:15.270 [2024-04-25 20:55:30.673158] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:148478361600 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.270 [2024-04-25 20:55:30.673189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.270 [2024-04-25 20:55:30.673229] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:65536 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.270 [2024-04-25 20:55:30.673246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.270 [2024-04-25 20:55:30.673300] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 
SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.270 [2024-04-25 20:55:30.673316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.270 #60 NEW cov: 11993 ft: 15160 corp: 36/2794b lim: 105 exec/s: 60 rss: 70Mb L: 64/104 MS: 1 ChangeBinInt- 00:08:15.270 [2024-04-25 20:55:30.723170] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:169953198080 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.270 [2024-04-25 20:55:30.723196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.270 [2024-04-25 20:55:30.723235] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.270 [2024-04-25 20:55:30.723251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.270 #61 NEW cov: 11993 ft: 15208 corp: 37/2845b lim: 105 exec/s: 61 rss: 70Mb L: 51/104 MS: 1 EraseBytes- 00:08:15.270 [2024-04-25 20:55:30.763645] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:169953198080 len:65281 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.270 [2024-04-25 20:55:30.763671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.270 [2024-04-25 20:55:30.763728] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.271 [2024-04-25 20:55:30.763743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.271 [2024-04-25 20:55:30.763797] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.271 [2024-04-25 20:55:30.763813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.271 [2024-04-25 20:55:30.763866] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:13961440319825297857 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.271 [2024-04-25 20:55:30.763882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.271 [2024-04-25 20:55:30.763935] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.271 [2024-04-25 20:55:30.763950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:15.271 #62 NEW cov: 11993 ft: 15256 corp: 38/2950b lim: 105 exec/s: 62 rss: 70Mb L: 105/105 MS: 1 CopyPart- 00:08:15.271 [2024-04-25 20:55:30.813421] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:934457376768 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.271 [2024-04-25 20:55:30.813451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.271 [2024-04-25 20:55:30.813499] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 
nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.271 [2024-04-25 20:55:30.813514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.271 #63 NEW cov: 11993 ft: 15294 corp: 39/3002b lim: 105 exec/s: 63 rss: 70Mb L: 52/105 MS: 1 ChangeBinInt- 00:08:15.271 [2024-04-25 20:55:30.863766] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2449473536 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.271 [2024-04-25 20:55:30.863793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.271 [2024-04-25 20:55:30.863843] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.271 [2024-04-25 20:55:30.863860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.271 [2024-04-25 20:55:30.863916] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.271 [2024-04-25 20:55:30.863932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.271 [2024-04-25 20:55:30.863988] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:13907115649332789697 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.271 [2024-04-25 20:55:30.864008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.271 #64 NEW cov: 11993 ft: 15322 corp: 40/3106b lim: 105 exec/s: 64 rss: 70Mb L: 104/105 MS: 1 CopyPart- 00:08:15.271 [2024-04-25 20:55:30.903775] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:169953198080 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.271 [2024-04-25 20:55:30.903802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.271 [2024-04-25 20:55:30.903845] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:65536 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.271 [2024-04-25 20:55:30.903860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.271 [2024-04-25 20:55:30.903917] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.271 [2024-04-25 20:55:30.903933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.530 [2024-04-25 20:55:30.943872] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:169953198080 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.530 [2024-04-25 20:55:30.943898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.530 [2024-04-25 20:55:30.943935] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:65536 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.530 [2024-04-25 
20:55:30.943950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.530 [2024-04-25 20:55:30.944010] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1060856922112 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.530 [2024-04-25 20:55:30.944026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.530 #66 NEW cov: 11993 ft: 15341 corp: 41/3170b lim: 105 exec/s: 66 rss: 70Mb L: 64/105 MS: 2 PersAutoDict-ChangeBinInt- DE: "\001\000\000\000\000\000\000\000"- 00:08:15.530 [2024-04-25 20:55:30.983880] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:169953198080 len:2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.530 [2024-04-25 20:55:30.983907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.530 [2024-04-25 20:55:30.983965] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:4097 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.530 [2024-04-25 20:55:30.983982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.530 #67 NEW cov: 11993 ft: 15366 corp: 42/3226b lim: 105 exec/s: 33 rss: 70Mb L: 56/105 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:08:15.530 #67 DONE cov: 11993 ft: 15366 corp: 42/3226b lim: 105 exec/s: 33 rss: 70Mb 00:08:15.530 ###### Recommended dictionary. ###### 00:08:15.530 "\001\000\000\000\000\000\000\000" # Uses: 2 00:08:15.530 ###### End of recommended dictionary. 
###### 00:08:15.530 Done 67 runs in 2 second(s) 00:08:15.530 20:55:31 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_16.conf /var/tmp/suppress_nvmf_fuzz 00:08:15.530 20:55:31 -- ../common.sh@72 -- # (( i++ )) 00:08:15.530 20:55:31 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:15.530 20:55:31 -- ../common.sh@73 -- # start_llvm_fuzz 17 1 0x1 00:08:15.530 20:55:31 -- nvmf/run.sh@23 -- # local fuzzer_type=17 00:08:15.530 20:55:31 -- nvmf/run.sh@24 -- # local timen=1 00:08:15.530 20:55:31 -- nvmf/run.sh@25 -- # local core=0x1 00:08:15.530 20:55:31 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:15.530 20:55:31 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_17.conf 00:08:15.530 20:55:31 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:15.530 20:55:31 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:15.530 20:55:31 -- nvmf/run.sh@34 -- # printf %02d 17 00:08:15.530 20:55:31 -- nvmf/run.sh@34 -- # port=4417 00:08:15.530 20:55:31 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:15.530 20:55:31 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' 00:08:15.530 20:55:31 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4417"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:15.530 20:55:31 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:15.530 20:55:31 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:15.531 20:55:31 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' -c /tmp/fuzz_json_17.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 -Z 17 00:08:15.531 [2024-04-25 20:55:31.154718] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:08:15.531 [2024-04-25 20:55:31.154801] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid199658 ] 00:08:15.531 EAL: No free 2048 kB hugepages reported on node 1 00:08:15.789 [2024-04-25 20:55:31.298031] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:08:15.789 [2024-04-25 20:55:31.335945] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:15.789 [2024-04-25 20:55:31.356399] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:15.789 [2024-04-25 20:55:31.408492] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:15.789 [2024-04-25 20:55:31.424798] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4417 *** 00:08:15.789 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:15.789 INFO: Seed: 1580218209 00:08:16.048 INFO: Loaded 1 modules (348579 inline 8-bit counters): 348579 [0x2764bcc, 0x27b9d6f), 00:08:16.048 INFO: Loaded 1 PC tables (348579 PCs): 348579 [0x27b9d70,0x2d0b7a0), 00:08:16.048 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:16.048 INFO: A corpus is not provided, starting from an empty corpus 00:08:16.048 #2 INITED exec/s: 0 rss: 61Mb 00:08:16.048 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:16.048 This may also happen if the target rejected all inputs we tried so far 00:08:16.048 [2024-04-25 20:55:31.500806] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:7812738664868113516 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.048 [2024-04-25 20:55:31.500845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.048 [2024-04-25 20:55:31.500967] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7812738666512280684 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.048 [2024-04-25 20:55:31.500988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.307 NEW_FUNC[1/672]: 0x4bdaa0 in fuzz_nvm_write_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:540 00:08:16.307 NEW_FUNC[2/672]: 0x4e0360 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:16.307 #28 NEW cov: 11770 ft: 11771 corp: 2/64b lim: 120 exec/s: 0 rss: 68Mb L: 63/63 MS: 1 InsertRepeatedBytes- 00:08:16.307 [2024-04-25 20:55:31.831844] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:174854144 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.307 [2024-04-25 20:55:31.831896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.307 [2024-04-25 20:55:31.832032] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7812738666512280684 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.307 [2024-04-25 20:55:31.832061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.307 #34 NEW cov: 11900 ft: 12313 corp: 3/135b lim: 120 exec/s: 0 rss: 68Mb L: 71/71 MS: 1 CMP- DE: "\020\000\000\000\000\000\000\000"- 00:08:16.307 [2024-04-25 20:55:31.881720] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:7812738664868113516 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.307 [2024-04-25 20:55:31.881748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.307 [2024-04-25 20:55:31.881872] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7812738666512280684 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.307 [2024-04-25 20:55:31.881895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.307 #35 NEW cov: 11906 ft: 12693 corp: 4/198b lim: 120 exec/s: 0 rss: 68Mb L: 63/71 MS: 1 
ChangeBit- 00:08:16.307 [2024-04-25 20:55:31.922060] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:7812738664868113516 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.307 [2024-04-25 20:55:31.922093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.307 [2024-04-25 20:55:31.922222] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7812738666512280684 len:4097 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.307 [2024-04-25 20:55:31.922244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.307 #36 NEW cov: 11991 ft: 12915 corp: 5/261b lim: 120 exec/s: 0 rss: 68Mb L: 63/71 MS: 1 PersAutoDict- DE: "\020\000\000\000\000\000\000\000"- 00:08:16.307 [2024-04-25 20:55:31.961983] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:7812738664861690988 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.307 [2024-04-25 20:55:31.962028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.567 #37 NEW cov: 11991 ft: 13838 corp: 6/305b lim: 120 exec/s: 0 rss: 68Mb L: 44/71 MS: 1 CrossOver- 00:08:16.567 [2024-04-25 20:55:32.002241] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:7812738664868113516 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.567 [2024-04-25 20:55:32.002272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.567 [2024-04-25 20:55:32.002402] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7812738666512280684 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.567 [2024-04-25 20:55:32.002425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.567 #38 NEW cov: 11991 ft: 14049 corp: 7/368b lim: 120 exec/s: 0 rss: 69Mb L: 63/71 MS: 1 CopyPart- 00:08:16.567 [2024-04-25 20:55:32.042384] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:174854144 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.567 [2024-04-25 20:55:32.042418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.567 [2024-04-25 20:55:32.042531] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7812738666512280684 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.567 [2024-04-25 20:55:32.042555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.567 #39 NEW cov: 11991 ft: 14165 corp: 8/439b lim: 120 exec/s: 0 rss: 69Mb L: 71/71 MS: 1 ChangeBit- 00:08:16.567 [2024-04-25 20:55:32.082510] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:174854144 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.567 [2024-04-25 20:55:32.082544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.567 [2024-04-25 20:55:32.082657] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: 
WRITE sqid:1 cid:1 nsid:0 lba:7812738666512280684 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.567 [2024-04-25 20:55:32.082679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.567 #40 NEW cov: 11991 ft: 14215 corp: 9/510b lim: 120 exec/s: 0 rss: 69Mb L: 71/71 MS: 1 ChangeBit- 00:08:16.567 [2024-04-25 20:55:32.122128] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.567 [2024-04-25 20:55:32.122163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.567 #42 NEW cov: 11991 ft: 14263 corp: 10/546b lim: 120 exec/s: 0 rss: 69Mb L: 36/71 MS: 2 CopyPart-InsertRepeatedBytes- 00:08:16.567 [2024-04-25 20:55:32.162574] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:7812738665322318444 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.567 [2024-04-25 20:55:32.162606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.567 [2024-04-25 20:55:32.162719] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7812738666512280684 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.567 [2024-04-25 20:55:32.162741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.567 [2024-04-25 20:55:32.162861] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:15360106557709511788 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.567 [2024-04-25 20:55:32.162884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.567 #47 NEW cov: 11991 ft: 14660 corp: 11/620b lim: 120 exec/s: 0 rss: 69Mb L: 74/74 MS: 5 InsertByte-InsertByte-CrossOver-CMP-CrossOver- DE: "\325*\017\210\251\177\000\000"- 00:08:16.567 [2024-04-25 20:55:32.203050] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:7812738665322318444 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.567 [2024-04-25 20:55:32.203086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.567 [2024-04-25 20:55:32.203180] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7812738666512280684 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.568 [2024-04-25 20:55:32.203200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.568 [2024-04-25 20:55:32.203322] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:15360106557172640876 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.568 [2024-04-25 20:55:32.203344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.827 #48 NEW cov: 11991 ft: 14690 corp: 12/694b lim: 120 exec/s: 0 rss: 69Mb L: 74/74 MS: 1 ChangeBit- 00:08:16.827 [2024-04-25 20:55:32.242613] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:7812738664868113516 len:27757 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.827 [2024-04-25 20:55:32.242646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.827 [2024-04-25 20:55:32.242759] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7812738666512280684 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.827 [2024-04-25 20:55:32.242782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.827 #49 NEW cov: 11991 ft: 14697 corp: 13/765b lim: 120 exec/s: 0 rss: 69Mb L: 71/74 MS: 1 PersAutoDict- DE: "\020\000\000\000\000\000\000\000"- 00:08:16.827 [2024-04-25 20:55:32.283015] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:7812738664868113516 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.827 [2024-04-25 20:55:32.283048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.827 [2024-04-25 20:55:32.283148] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7812738666512280684 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.827 [2024-04-25 20:55:32.283173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.827 #50 NEW cov: 11991 ft: 14711 corp: 14/836b lim: 120 exec/s: 0 rss: 70Mb L: 71/74 MS: 1 ChangeBit- 00:08:16.827 [2024-04-25 20:55:32.323076] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:7812738664868113516 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.827 [2024-04-25 20:55:32.323106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.827 [2024-04-25 20:55:32.323177] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7812738666512280684 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.827 [2024-04-25 20:55:32.323196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.827 #51 NEW cov: 11991 ft: 14772 corp: 15/899b lim: 120 exec/s: 0 rss: 70Mb L: 63/74 MS: 1 CopyPart- 00:08:16.827 [2024-04-25 20:55:32.363239] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:7812738664868113516 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.827 [2024-04-25 20:55:32.363272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.827 [2024-04-25 20:55:32.363390] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7812738666512280684 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.827 [2024-04-25 20:55:32.363410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.827 NEW_FUNC[1/1]: 0x19e6b30 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:16.827 #52 NEW cov: 12014 ft: 14812 corp: 16/962b lim: 120 exec/s: 0 rss: 70Mb L: 63/74 MS: 1 ChangeByte- 00:08:16.827 [2024-04-25 20:55:32.403455] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 
nsid:0 lba:174854144 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.827 [2024-04-25 20:55:32.403487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.827 [2024-04-25 20:55:32.403600] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7812738666512280684 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.827 [2024-04-25 20:55:32.403623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.827 #53 NEW cov: 12014 ft: 14823 corp: 17/1033b lim: 120 exec/s: 0 rss: 70Mb L: 71/74 MS: 1 ShuffleBytes- 00:08:16.827 [2024-04-25 20:55:32.443721] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:7812738665322318444 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.827 [2024-04-25 20:55:32.443753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.827 [2024-04-25 20:55:32.443875] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7812738666512280684 len:27501 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.827 [2024-04-25 20:55:32.443896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.827 [2024-04-25 20:55:32.444019] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:15360106557172640876 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.827 [2024-04-25 20:55:32.444039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.827 #59 NEW cov: 12014 ft: 14871 corp: 18/1107b lim: 120 exec/s: 59 rss: 70Mb L: 74/74 MS: 1 ChangeBinInt- 00:08:17.086 [2024-04-25 20:55:32.493444] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:4278910976 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.086 [2024-04-25 20:55:32.493474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.087 #63 NEW cov: 12014 ft: 14887 corp: 19/1145b lim: 120 exec/s: 63 rss: 70Mb L: 38/74 MS: 4 CopyPart-CopyPart-CMP-CrossOver- DE: "\377\013"- 00:08:17.087 [2024-04-25 20:55:32.533122] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:7794724266352209004 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.087 [2024-04-25 20:55:32.533149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.087 #64 NEW cov: 12014 ft: 14952 corp: 20/1189b lim: 120 exec/s: 64 rss: 70Mb L: 44/74 MS: 1 ChangeBit- 00:08:17.087 [2024-04-25 20:55:32.594231] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:4281916557463849580 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.087 [2024-04-25 20:55:32.594260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.087 [2024-04-25 20:55:32.594345] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7812738666512280684 len:27756 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.087 
[2024-04-25 20:55:32.594368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:17.087 [2024-04-25 20:55:32.594491] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:7842220571865410668 len:32513 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.087 [2024-04-25 20:55:32.594511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:17.087 #65 NEW cov: 12014 ft: 14978 corp: 21/1264b lim: 120 exec/s: 65 rss: 70Mb L: 75/75 MS: 1 InsertByte-
00:08:17.087 [2024-04-25 20:55:32.643434] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:9154248407001991717 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.087 [2024-04-25 20:55:32.643463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:17.087 #68 NEW cov: 12014 ft: 15028 corp: 22/1300b lim: 120 exec/s: 68 rss: 70Mb L: 36/75 MS: 3 CopyPart-InsertByte-CrossOver-
00:08:17.087 [2024-04-25 20:55:32.684010] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:4281916557463849580 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.087 [2024-04-25 20:55:32.684041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:17.087 [2024-04-25 20:55:32.684123] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7812738666512277100 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.087 [2024-04-25 20:55:32.684144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:17.087 [2024-04-25 20:55:32.684265] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:7812853831765347436 len:43392 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.087 [2024-04-25 20:55:32.684289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:17.087 #69 NEW cov: 12014 ft: 15050 corp: 23/1376b lim: 120 exec/s: 69 rss: 70Mb L: 76/76 MS: 1 InsertByte-
00:08:17.087 [2024-04-25 20:55:32.744406] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:174854144 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.087 [2024-04-25 20:55:32.744437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:17.087 [2024-04-25 20:55:32.744538] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7812738666512280684 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.087 [2024-04-25 20:55:32.744557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:17.346 #70 NEW cov: 12014 ft: 15138 corp: 24/1447b lim: 120 exec/s: 70 rss: 70Mb L: 71/76 MS: 1 ChangeByte-
00:08:17.346 [2024-04-25 20:55:32.794611] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:7812738664868113516 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.346 [2024-04-25 20:55:32.794639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:17.346 [2024-04-25 20:55:32.794754] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7812738666512280684 len:2669 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.346 [2024-04-25 20:55:32.794774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:17.346 #71 NEW cov: 12014 ft: 15166 corp: 25/1510b lim: 120 exec/s: 71 rss: 70Mb L: 63/76 MS: 1 CrossOver-
00:08:17.346 [2024-04-25 20:55:32.834511] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:4281916557463849580 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.346 [2024-04-25 20:55:32.834548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:17.346 [2024-04-25 20:55:32.834644] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7812738666512277100 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.346 [2024-04-25 20:55:32.834667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:17.346 [2024-04-25 20:55:32.834789] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:7812853831765347436 len:43392 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.346 [2024-04-25 20:55:32.834815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:17.346 #72 NEW cov: 12014 ft: 15183 corp: 26/1586b lim: 120 exec/s: 72 rss: 70Mb L: 76/76 MS: 1 ChangeBinInt-
00:08:17.346 [2024-04-25 20:55:32.885107] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:174854144 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.346 [2024-04-25 20:55:32.885142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:17.346 [2024-04-25 20:55:32.885244] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7812738666512280684 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.346 [2024-04-25 20:55:32.885267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:17.346 [2024-04-25 20:55:32.885388] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:7812738666512280684 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.346 [2024-04-25 20:55:32.885410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:17.346 #73 NEW cov: 12014 ft: 15211 corp: 27/1665b lim: 120 exec/s: 73 rss: 70Mb L: 79/79 MS: 1 PersAutoDict- DE: "\325*\017\210\251\177\000\000"-
00:08:17.346 [2024-04-25 20:55:32.934358] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:4278910976 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.346 [2024-04-25 20:55:32.934390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:17.346 #74 NEW cov: 12014 ft: 15240 corp: 28/1703b lim: 120 exec/s: 74 rss: 70Mb L: 38/79 MS: 1 CrossOver-
00:08:17.346 [2024-04-25 20:55:32.975068] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:7812738664868113516 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.346 [2024-04-25 20:55:32.975096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:17.346 [2024-04-25 20:55:32.975212] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7812738666512280684 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.346 [2024-04-25 20:55:32.975232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:17.346 #75 NEW cov: 12014 ft: 15247 corp: 29/1766b lim: 120 exec/s: 75 rss: 70Mb L: 63/79 MS: 1 ChangeBit-
00:08:17.619 [2024-04-25 20:55:33.015697] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:7812738665322318444 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.619 [2024-04-25 20:55:33.015730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:17.619 [2024-04-25 20:55:33.015829] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7812738666512280684 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.619 [2024-04-25 20:55:33.015851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:17.619 [2024-04-25 20:55:33.015972] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:7812756258698325100 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.619 [2024-04-25 20:55:33.015997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:17.619 [2024-04-25 20:55:33.016112] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:30518510416035840 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.619 [2024-04-25 20:55:33.016132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:17.619 #76 NEW cov: 12014 ft: 15592 corp: 30/1873b lim: 120 exec/s: 76 rss: 70Mb L: 107/107 MS: 1 CrossOver-
00:08:17.619 [2024-04-25 20:55:33.054855] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:4281916557463849580 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.619 [2024-04-25 20:55:33.054885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:17.619 [2024-04-25 20:55:33.054980] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:268435456 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.619 [2024-04-25 20:55:33.055008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:17.619 #77 NEW cov: 12014 ft: 15625 corp: 31/1935b lim: 120 exec/s: 77 rss: 70Mb L: 62/107 MS: 1 EraseBytes-
00:08:17.619 [2024-04-25 20:55:33.095602] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:4281916557463849580 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.619 [2024-04-25 20:55:33.095635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:17.619 [2024-04-25 20:55:33.095721] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7812684790442516076 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.619 [2024-04-25 20:55:33.095744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:17.619 [2024-04-25 20:55:33.095861] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:6803932349981289580 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.619 [2024-04-25 20:55:33.095885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:17.619 #78 NEW cov: 12014 ft: 15634 corp: 32/2011b lim: 120 exec/s: 78 rss: 70Mb L: 76/107 MS: 1 CopyPart-
00:08:17.619 [2024-04-25 20:55:33.134894] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:7812738664868113516 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.619 [2024-04-25 20:55:33.134923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:17.619 #79 NEW cov: 12014 ft: 15640 corp: 33/2053b lim: 120 exec/s: 79 rss: 70Mb L: 42/107 MS: 1 EraseBytes-
00:08:17.619 [2024-04-25 20:55:33.175557] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:7812738664868113449 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.619 [2024-04-25 20:55:33.175589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:17.619 [2024-04-25 20:55:33.175695] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7812738666512280684 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.619 [2024-04-25 20:55:33.175720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:17.619 #80 NEW cov: 12014 ft: 15661 corp: 34/2116b lim: 120 exec/s: 80 rss: 70Mb L: 63/107 MS: 1 ChangeByte-
00:08:17.619 [2024-04-25 20:55:33.215249] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:7812738664868113516 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.619 [2024-04-25 20:55:33.215282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:17.619 [2024-04-25 20:55:33.215397] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7812738666512280684 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.619 [2024-04-25 20:55:33.215421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:17.619 #81 NEW cov: 12014 ft: 15678 corp: 35/2179b lim: 120 exec/s: 81 rss: 70Mb L: 63/107 MS: 1 ShuffleBytes-
00:08:17.619 [2024-04-25 20:55:33.256215] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:4278910976 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.619 [2024-04-25 20:55:33.256244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:17.619 [2024-04-25 20:55:33.256349] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:720575940379279360 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.619 [2024-04-25 20:55:33.256369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:17.619 [2024-04-25 20:55:33.256491] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.619 [2024-04-25 20:55:33.256515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:17.878 #82 NEW cov: 12014 ft: 15694 corp: 36/2251b lim: 120 exec/s: 82 rss: 70Mb L: 72/107 MS: 1 CopyPart-
00:08:17.878 [2024-04-25 20:55:33.305947] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:4281916557463849580 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.878 [2024-04-25 20:55:33.305980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:17.878 [2024-04-25 20:55:33.306092] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:1812987904 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.878 [2024-04-25 20:55:33.306117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:17.878 #83 NEW cov: 12014 ft: 15729 corp: 37/2314b lim: 120 exec/s: 83 rss: 70Mb L: 63/107 MS: 1 EraseBytes-
00:08:17.878 [2024-04-25 20:55:33.346077] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:7812738664868113516 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.878 [2024-04-25 20:55:33.346111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:17.878 [2024-04-25 20:55:33.346222] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7812738666512280684 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.878 [2024-04-25 20:55:33.346244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:17.878 #84 NEW cov: 12014 ft: 15736 corp: 38/2385b lim: 120 exec/s: 84 rss: 70Mb L: 71/107 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\006"-
00:08:17.878 [2024-04-25 20:55:33.386540] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:174854151 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.878 [2024-04-25 20:55:33.386568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:17.878 [2024-04-25 20:55:33.386665] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7812738666512280684 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.878 [2024-04-25 20:55:33.386688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:17.878 [2024-04-25 20:55:33.386814] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:7812738666512280684 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.878 [2024-04-25 20:55:33.386837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:17.878 #85 NEW cov: 12014 ft: 15743 corp: 39/2464b lim: 120 exec/s: 85 rss: 71Mb L: 79/107 MS: 1 ChangeBinInt-
00:08:17.878 [2024-04-25 20:55:33.436626] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2531906048913253155 len:8996 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.878 [2024-04-25 20:55:33.436658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:17.878 [2024-04-25 20:55:33.436777] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:2531906049332683555 len:8996 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.878 [2024-04-25 20:55:33.436800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:17.878 [2024-04-25 20:55:33.436919] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:2531906049332683555 len:8996 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.878 [2024-04-25 20:55:33.436943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:17.878 #89 NEW cov: 12014 ft: 15781 corp: 40/2546b lim: 120 exec/s: 89 rss: 71Mb L: 82/107 MS: 4 CopyPart-CopyPart-ChangeByte-InsertRepeatedBytes-
00:08:17.878 [2024-04-25 20:55:33.476484] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:7812738664868113516 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.878 [2024-04-25 20:55:33.476515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:17.878 [2024-04-25 20:55:33.476642] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7812738666512280684 len:27757 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:17.878 [2024-04-25 20:55:33.476663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:17.878 #90 NEW cov: 12014 ft: 15789 corp: 41/2609b lim: 120 exec/s: 45 rss: 71Mb L: 63/107 MS: 1 ChangeBinInt-
00:08:17.878 #90 DONE cov: 12014 ft: 15789 corp: 41/2609b lim: 120 exec/s: 45 rss: 71Mb
00:08:17.878 ###### Recommended dictionary. ######
00:08:17.878 "\020\000\000\000\000\000\000\000" # Uses: 2
00:08:17.878 "\325*\017\210\251\177\000\000" # Uses: 1
00:08:17.878 "\377\013" # Uses: 0
00:08:17.878 "\001\000\000\000\000\000\000\006" # Uses: 0
00:08:17.878 ###### End of recommended dictionary. ######
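The four quoted entries above are libFuzzer's end-of-run dictionary suggestion for this target, printed with C-style octal escapes. Rewritten in libFuzzer's dictionary-file syntax (hex escapes), they could be fed back into a later run as seed tokens; the file path below and the assumption that this harness forwards the standard -dict= option are illustrative only, not something this job's run.sh does:

    cat > /tmp/llvm_nvmf_17.dict <<'EOF'
    # tokens suggested by the run above, octal escapes converted to hex
    kw1="\x10\x00\x00\x00\x00\x00\x00\x00"
    kw2="\xd5\x2a\x0f\x88\xa9\x7f\x00\x00"
    kw3="\xff\x0b"
    kw4="\x01\x00\x00\x00\x00\x00\x00\x06"
    EOF
    # hypothetical follow-up run: ./llvm_nvme_fuzz ... -dict=/tmp/llvm_nvmf_17.dict ...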
00:08:17.878 Done 90 runs in 2 second(s)
00:08:18.138 20:55:33 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_17.conf /var/tmp/suppress_nvmf_fuzz
00:08:18.138 20:55:33 -- ../common.sh@72 -- # (( i++ ))
00:08:18.138 20:55:33 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:18.138 20:55:33 -- ../common.sh@73 -- # start_llvm_fuzz 18 1 0x1
00:08:18.138 20:55:33 -- nvmf/run.sh@23 -- # local fuzzer_type=18
00:08:18.138 20:55:33 -- nvmf/run.sh@24 -- # local timen=1
00:08:18.138 20:55:33 -- nvmf/run.sh@25 -- # local core=0x1
00:08:18.138 20:55:33 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18
00:08:18.138 20:55:33 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_18.conf
00:08:18.138 20:55:33 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz
00:08:18.138 20:55:33 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0
00:08:18.138 20:55:33 -- nvmf/run.sh@34 -- # printf %02d 18
00:08:18.138 20:55:33 -- nvmf/run.sh@34 -- # port=4418
00:08:18.138 20:55:33 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18
00:08:18.138 20:55:33 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418'
00:08:18.138 20:55:33 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4418"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:18.138 20:55:33 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect
00:08:18.138 20:55:33 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create
00:08:18.138 20:55:33 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' -c /tmp/fuzz_json_18.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 -Z 18
00:08:18.398 [2024-04-25 20:55:33.648801] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization...
00:08:18.398 [2024-04-25 20:55:33.648872] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid200148 ]
00:08:18.398 EAL: No free 2048 kB hugepages reported on node 1
00:08:18.398 [2024-04-25 20:55:33.785869] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation.
00:08:18.398 [2024-04-25 20:55:33.823203] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:18.398 [2024-04-25 20:55:33.842280] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:18.398 [2024-04-25 20:55:33.894360] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:18.398 [2024-04-25 20:55:33.910647] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4418 ***
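The LSAN_OPTIONS assignment and the two echo leak:... commands in the trace above are LeakSanitizer's standard suppression mechanism: leaks whose stacks match a listed symbol are dropped from the report. A minimal standalone sketch of that wiring, assuming (as the trace suggests but does not show) that the echoed patterns are redirected into the suppression file:

    suppress=/var/tmp/suppress_nvmf_fuzz
    echo 'leak:spdk_nvmf_qpair_disconnect' > "$suppress"
    echo 'leak:nvmf_ctrlr_create' >> "$suppress"
    export LSAN_OPTIONS=report_objects=1:suppressions=$suppress:print_suppressions=0
    # leaks whose stacks contain either symbol are now ignored by LeakSanitizer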
00:08:18.398 INFO: Running with entropic power schedule (0xFF, 100).
00:08:18.398 INFO: Seed: 4065193733
INFO: Loaded 1 modules (348579 inline 8-bit counters): 348579 [0x2764bcc, 0x27b9d6f),
00:08:18.398 INFO: Loaded 1 PC tables (348579 PCs): 348579 [0x27b9d70,0x2d0b7a0),
00:08:18.398 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18
00:08:18.398 INFO: A corpus is not provided, starting from an empty corpus
00:08:18.398 #2 INITED exec/s: 0 rss: 60Mb
00:08:18.398 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:18.398 This may also happen if the target rejected all inputs we tried so far
00:08:18.398 [2024-04-25 20:55:33.955854] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
00:08:18.398 [2024-04-25 20:55:33.955882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:18.398 [2024-04-25 20:55:33.955916] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
00:08:18.398 [2024-04-25 20:55:33.955930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:18.398 [2024-04-25 20:55:33.955981] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0
00:08:18.398 [2024-04-25 20:55:33.956008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:18.657 NEW_FUNC[1/670]: 0x4c1390 in fuzz_nvm_write_zeroes_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:562
00:08:18.657 NEW_FUNC[2/670]: 0x4e0360 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:08:18.657 #4 NEW cov: 11713 ft: 11714 corp: 2/75b lim: 100 exec/s: 0 rss: 68Mb L: 74/74 MS: 2 InsertByte-InsertRepeatedBytes-
00:08:18.657 [2024-04-25 20:55:34.276604] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
00:08:18.657 [2024-04-25 20:55:34.276635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:18.657 [2024-04-25 20:55:34.276694] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
00:08:18.657 [2024-04-25 20:55:34.276708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:18.657 #5 NEW cov: 11843 ft: 12606 corp: 3/116b lim: 100 exec/s: 0 rss: 68Mb L: 41/74 MS: 1 EraseBytes-
00:08:18.916 [2024-04-25 20:55:34.326696] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
00:08:18.916 [2024-04-25 20:55:34.326724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:18.916 [2024-04-25 20:55:34.326780] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
00:08:18.916 [2024-04-25 20:55:34.326794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:18.916 #6 NEW cov: 11849 ft: 12805 corp: 4/157b lim: 100 exec/s: 0 rss: 68Mb L: 41/74 MS: 1 CopyPart-
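The "0 files found" and empty-corpus INFO lines above mean run 18 starts with no seed inputs, so the earliest NEW entries come from mutations of an empty input. The -D directory in the invocation is a libFuzzer-style corpus directory of raw input files, so it could in principle be pre-seeded before a run; a hypothetical example (file name and bytes are arbitrary placeholders):

    corpus=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18
    mkdir -p "$corpus"
    # any raw bytes serve as a seed; the harness decodes them into command fields
    printf '\x01\x1f\x00\x00\x00\x00\x00\x00' > "$corpus/seed-write-zeroes"
    # inputs that reach new coverage during the run are saved back into $corpus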
00:08:18.916 [2024-04-25 20:55:34.366780] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
00:08:18.916 [2024-04-25 20:55:34.366805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:18.916 [2024-04-25 20:55:34.366858] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
00:08:18.916 [2024-04-25 20:55:34.366873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:18.916 #7 NEW cov: 11934 ft: 13074 corp: 5/198b lim: 100 exec/s: 0 rss: 68Mb L: 41/74 MS: 1 CMP- DE: "\001\037"-
00:08:18.916 [2024-04-25 20:55:34.406952] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
00:08:18.916 [2024-04-25 20:55:34.406977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:18.916 [2024-04-25 20:55:34.407052] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
00:08:18.916 [2024-04-25 20:55:34.407068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:18.916 #13 NEW cov: 11934 ft: 13142 corp: 6/239b lim: 100 exec/s: 0 rss: 68Mb L: 41/74 MS: 1 ShuffleBytes-
00:08:18.916 [2024-04-25 20:55:34.447009] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
00:08:18.916 [2024-04-25 20:55:34.447034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:18.916 [2024-04-25 20:55:34.447086] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
00:08:18.916 [2024-04-25 20:55:34.447100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:18.916 #14 NEW cov: 11934 ft: 13260 corp: 7/280b lim: 100 exec/s: 0 rss: 68Mb L: 41/74 MS: 1 ChangeBinInt-
00:08:18.916 [2024-04-25 20:55:34.487267] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
00:08:18.916 [2024-04-25 20:55:34.487291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:18.916 [2024-04-25 20:55:34.487332] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
00:08:18.916 [2024-04-25 20:55:34.487346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:18.916 [2024-04-25 20:55:34.487400] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0
00:08:18.916 [2024-04-25 20:55:34.487414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:18.917 #15 NEW cov: 11934 ft: 13318 corp: 8/351b lim: 100 exec/s: 0 rss: 68Mb L: 71/74 MS: 1 CopyPart-
00:08:18.917 [2024-04-25 20:55:34.527247] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
00:08:18.917 [2024-04-25 20:55:34.527272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:18.917 [2024-04-25 20:55:34.527309] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
00:08:18.917 [2024-04-25 20:55:34.527326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:18.917 #16 NEW cov: 11934 ft: 13383 corp: 9/407b lim: 100 exec/s: 0 rss: 69Mb L: 56/74 MS: 1 InsertRepeatedBytes-
00:08:18.917 [2024-04-25 20:55:34.567469] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
00:08:18.917 [2024-04-25 20:55:34.567494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:18.917 [2024-04-25 20:55:34.567539] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
00:08:18.917 [2024-04-25 20:55:34.567553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:18.917 [2024-04-25 20:55:34.567607] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0
00:08:18.917 [2024-04-25 20:55:34.567622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:19.176 #17 NEW cov: 11934 ft: 13420 corp: 10/486b lim: 100 exec/s: 0 rss: 69Mb L: 79/79 MS: 1 CrossOver-
00:08:19.176 [2024-04-25 20:55:34.607463] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
00:08:19.176 [2024-04-25 20:55:34.607488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:19.176 [2024-04-25 20:55:34.607542] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
00:08:19.176 [2024-04-25 20:55:34.607557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:19.176 #18 NEW cov: 11934 ft: 13480 corp: 11/529b lim: 100 exec/s: 0 rss: 69Mb L: 43/79 MS: 1 PersAutoDict- DE: "\001\037"-
00:08:19.176 [2024-04-25 20:55:34.647636] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
00:08:19.176 [2024-04-25 20:55:34.647673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:19.176 #19 NEW cov: 11934 ft: 13620 corp: 12/572b lim: 100 exec/s: 0 rss: 69Mb L: 43/79 MS: 1 PersAutoDict- DE: "\001\037"-
00:08:19.176 [2024-04-25 20:55:34.687664] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
00:08:19.176 [2024-04-25 20:55:34.687688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:19.176 [2024-04-25 20:55:34.687740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:19.176 #24 NEW cov: 11934 ft: 13643 corp: 13/616b lim: 100 exec/s: 0 rss: 69Mb L: 44/79 MS: 5 ChangeByte-ShuffleBytes-CopyPart-CopyPart-CrossOver-
00:08:19.176 [2024-04-25 20:55:34.727708] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
00:08:19.176 [2024-04-25 20:55:34.727734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:19.176 #25 NEW cov: 11934 ft: 14037 corp: 14/643b lim: 100 exec/s: 0 rss: 69Mb L: 27/79 MS: 1 EraseBytes-
00:08:19.176 [2024-04-25 20:55:34.767934] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
00:08:19.176 [2024-04-25 20:55:34.767959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:19.176 [2024-04-25 20:55:34.768017] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
00:08:19.176 [2024-04-25 20:55:34.768035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:19.176 #26 NEW cov: 11934 ft: 14091 corp: 15/684b lim: 100 exec/s: 0 rss: 69Mb L: 41/79 MS: 1 EraseBytes-
00:08:19.176 [2024-04-25 20:55:34.808078] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
00:08:19.176 [2024-04-25 20:55:34.808103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:19.176 [2024-04-25 20:55:34.808154] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
00:08:19.176 [2024-04-25 20:55:34.808168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:19.176 #27 NEW cov: 11934 ft: 14115 corp: 16/732b lim: 100 exec/s: 0 rss: 69Mb L: 48/79 MS: 1 CopyPart-
00:08:19.435 [2024-04-25 20:55:34.848273] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
00:08:19.435 [2024-04-25 20:55:34.848297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:19.435 [2024-04-25 20:55:34.848345] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
00:08:19.435 [2024-04-25 20:55:34.848400] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0
00:08:19.435 [2024-04-25 20:55:34.848414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:19.436 NEW_FUNC[1/1]: 0x19e6b30 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
00:08:19.436 #28 NEW cov: 11957 ft: 14154 corp: 17/803b lim: 100 exec/s: 0 rss: 69Mb L: 71/79 MS: 1 ChangeBinInt-
00:08:19.436 [2024-04-25 20:55:34.898165] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
00:08:19.436 [2024-04-25 20:55:34.898190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:19.436 #29 NEW cov: 11957 ft: 14186 corp: 18/830b lim: 100 exec/s: 0 rss: 69Mb L: 27/79 MS: 1 ChangeBit-
00:08:19.436 [2024-04-25 20:55:34.938443] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
00:08:19.436 [2024-04-25 20:55:34.938469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:19.436 [2024-04-25 20:55:34.938522] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
00:08:19.436 [2024-04-25 20:55:34.938535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:19.436 #30 NEW cov: 11957 ft: 14210 corp: 19/885b lim: 100 exec/s: 30 rss: 69Mb L: 55/79 MS: 1 CrossOver-
00:08:19.436 [2024-04-25 20:55:34.978774] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
00:08:19.436 [2024-04-25 20:55:34.978799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:19.436 [2024-04-25 20:55:34.978845] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
00:08:19.436 [2024-04-25 20:55:34.978859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:19.436 [2024-04-25 20:55:34.978913] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0
00:08:19.436 [2024-04-25 20:55:34.978927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:19.436 [2024-04-25 20:55:34.978981] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0
00:08:19.436 [2024-04-25 20:55:34.979002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:19.436 #32 NEW cov: 11957 ft: 14522 corp: 20/974b lim: 100 exec/s: 32 rss: 69Mb L: 89/89 MS: 2 CrossOver-InsertRepeatedBytes-
00:08:19.436 [2024-04-25 20:55:35.018518] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
00:08:19.436 [2024-04-25 20:55:35.018543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:19.436 #33 NEW cov: 11957 ft: 14541 corp: 21/1001b lim: 100 exec/s: 33 rss: 70Mb L: 27/89 MS: 1 ChangeBit-
00:08:19.436 [2024-04-25 20:55:35.058777] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
00:08:19.436 [2024-04-25 20:55:35.058801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:19.436 [2024-04-25 20:55:35.058855] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
00:08:19.436 [2024-04-25 20:55:35.058869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:19.436 #34 NEW cov: 11957 ft: 14556 corp: 22/1042b lim: 100 exec/s: 34 rss: 70Mb L: 41/89 MS: 1 PersAutoDict- DE: "\001\037"-
00:08:19.695 [2024-04-25 20:55:35.098763] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
00:08:19.695 [2024-04-25 20:55:35.098789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:19.695 #35 NEW cov: 11957 ft: 14566 corp: 23/1069b lim: 100 exec/s: 35 rss: 70Mb L: 27/89 MS: 1 EraseBytes-
00:08:19.695 [2024-04-25 20:55:35.139012] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
00:08:19.695 [2024-04-25 20:55:35.139037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:19.695 [2024-04-25 20:55:35.139079] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
00:08:19.695 [2024-04-25 20:55:35.139094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:19.695 #36 NEW cov: 11957 ft: 14593 corp: 24/1110b lim: 100 exec/s: 36 rss: 70Mb L: 41/89 MS: 1 ChangeBinInt-
00:08:19.695 [2024-04-25 20:55:35.179109] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
00:08:19.695 [2024-04-25 20:55:35.179135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:19.695 [2024-04-25 20:55:35.179172] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
00:08:19.695 [2024-04-25 20:55:35.179187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:19.695 #37 NEW cov: 11957 ft: 14611 corp: 25/1153b lim: 100 exec/s: 37 rss: 70Mb L: 43/89 MS: 1 ShuffleBytes-
00:08:19.695 [2024-04-25 20:55:35.219100] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
00:08:19.695 [2024-04-25 20:55:35.219125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:19.695 #38 NEW cov: 11957 ft: 14629 corp: 26/1180b lim: 100 exec/s: 38 rss: 70Mb L: 27/89 MS: 1 ChangeBit-
00:08:19.695 [2024-04-25 20:55:35.259220] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
00:08:19.695 [2024-04-25 20:55:35.259245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:19.696 #39 NEW cov: 11957 ft: 14673 corp: 27/1214b lim: 100 exec/s: 39 rss: 70Mb L: 34/89 MS: 1 CrossOver-
00:08:19.696 [2024-04-25 20:55:35.299526] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
00:08:19.696 [2024-04-25 20:55:35.299550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:19.696 [2024-04-25 20:55:35.299599] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
00:08:19.696 [2024-04-25 20:55:35.299615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:19.696 [2024-04-25 20:55:35.299670] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0
00:08:19.696 [2024-04-25 20:55:35.299685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:19.696 #40 NEW cov: 11957 ft: 14729 corp: 28/1293b lim: 100 exec/s: 40 rss: 70Mb L: 79/89 MS: 1 CopyPart-
00:08:19.696 [2024-04-25 20:55:35.339695] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
00:08:19.696 [2024-04-25 20:55:35.339720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:19.696 [2024-04-25 20:55:35.339755] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
00:08:19.696 [2024-04-25 20:55:35.339768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:19.696 [2024-04-25 20:55:35.339823] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0
00:08:19.696 [2024-04-25 20:55:35.339837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:19.955 #41 NEW cov: 11957 ft: 14751 corp: 29/1371b lim: 100 exec/s: 41 rss: 70Mb L: 78/89 MS: 1 CopyPart-
00:08:19.955 [2024-04-25 20:55:35.379805] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
00:08:19.955 [2024-04-25 20:55:35.379830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:19.955 [2024-04-25 20:55:35.379864] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
00:08:19.955 [2024-04-25 20:55:35.379878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:19.955 [2024-04-25 20:55:35.379933] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0
00:08:19.955 [2024-04-25 20:55:35.379947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:19.955 #42 NEW cov: 11957 ft: 14763 corp: 30/1445b lim: 100 exec/s: 42 rss: 70Mb L: 74/89 MS: 1 ChangeBinInt-
00:08:19.955 [2024-04-25 20:55:35.419746] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
00:08:19.955 [2024-04-25 20:55:35.419770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:19.955 [2024-04-25 20:55:35.419809] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
00:08:19.955 [2024-04-25 20:55:35.419822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:19.955 #43 NEW cov: 11957 ft: 14773 corp: 31/1501b lim: 100 exec/s: 43 rss: 70Mb L: 56/89 MS: 1 ChangeBinInt-
00:08:19.955 [2024-04-25 20:55:35.459829] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
00:08:19.955 [2024-04-25 20:55:35.459854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:19.955 #44 NEW cov: 11957 ft: 14791 corp: 32/1528b lim: 100 exec/s: 44 rss: 70Mb L: 27/89 MS: 1 ChangeByte-
00:08:19.955 [2024-04-25 20:55:35.500247] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
00:08:19.955 [2024-04-25 20:55:35.500273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:19.955 [2024-04-25 20:55:35.500313] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
00:08:19.955 [2024-04-25 20:55:35.500326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:19.955 [2024-04-25 20:55:35.500380] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0
00:08:19.955 [2024-04-25 20:55:35.500394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:19.955 [2024-04-25 20:55:35.500450] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0
00:08:19.955 [2024-04-25 20:55:35.500463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:19.955 #45 NEW cov: 11957 ft: 14804 corp: 33/1609b lim: 100 exec/s: 45 rss: 70Mb L: 81/89 MS: 1 CopyPart-
00:08:19.955 [2024-04-25 20:55:35.540122] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
00:08:19.955 [2024-04-25 20:55:35.540148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:19.955 [2024-04-25 20:55:35.540185] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
00:08:19.955 [2024-04-25 20:55:35.540199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:19.955 #46 NEW cov: 11957 ft: 14820 corp: 34/1650b lim: 100 exec/s: 46 rss: 70Mb L: 41/89 MS: 1 ChangeBinInt-
00:08:19.955 [2024-04-25 20:55:35.580482] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
00:08:19.955 [2024-04-25 20:55:35.580509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:19.955 [2024-04-25 20:55:35.580551] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
00:08:19.955 [2024-04-25 20:55:35.580566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:19.955 [2024-04-25 20:55:35.580620] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0
00:08:19.955 [2024-04-25 20:55:35.580634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:19.955 [2024-04-25 20:55:35.580690] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0
00:08:19.955 [2024-04-25 20:55:35.580705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:19.955 #47 NEW cov: 11957 ft: 14848 corp: 35/1731b lim: 100 exec/s: 47 rss: 70Mb L: 81/89 MS: 1 PersAutoDict- DE: "\001\037"-
00:08:20.215 [2024-04-25 20:55:35.620347] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
00:08:20.215 [2024-04-25 20:55:35.620373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:20.215 [2024-04-25 20:55:35.620413] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
00:08:20.215 [2024-04-25 20:55:35.620428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:20.215 #48 NEW cov: 11957 ft: 14870 corp: 36/1787b lim: 100 exec/s: 48 rss: 70Mb L: 56/89 MS: 1 CrossOver-
00:08:20.216 [2024-04-25 20:55:35.660339] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
00:08:20.216 [2024-04-25 20:55:35.660364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:20.216 #49 NEW cov: 11957 ft: 14902 corp: 37/1824b lim: 100 exec/s: 49 rss: 70Mb L: 37/89 MS: 1 CopyPart-
00:08:20.216 [2024-04-25 20:55:35.700730] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
00:08:20.216 [2024-04-25 20:55:35.700757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:20.216 [2024-04-25 20:55:35.700808] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
00:08:20.216 [2024-04-25 20:55:35.700821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:20.216 [2024-04-25 20:55:35.700876] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0
00:08:20.216 [2024-04-25 20:55:35.700890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:20.216 #50 NEW cov: 11957 ft: 14912 corp: 38/1902b lim: 100 exec/s: 50 rss: 70Mb L: 78/89 MS: 1 CopyPart-
00:08:20.216 [2024-04-25 20:55:35.740927] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
00:08:20.216 [2024-04-25 20:55:35.740951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:20.216 [2024-04-25 20:55:35.741002] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
00:08:20.216 [2024-04-25 20:55:35.741017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:20.216 [2024-04-25 20:55:35.741074] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0
00:08:20.216 [2024-04-25 20:55:35.741088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:20.216 [2024-04-25 20:55:35.741144] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0
00:08:20.216 [2024-04-25 20:55:35.741158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:20.216 #51 NEW cov: 11957 ft: 14936 corp: 39/1988b lim: 100 exec/s: 51 rss: 70Mb L: 86/89 MS: 1 InsertRepeatedBytes-
00:08:20.216 [2024-04-25 20:55:35.780942] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
00:08:20.216 [2024-04-25 20:55:35.780968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:20.216 [2024-04-25 20:55:35.781019] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
00:08:20.216 [2024-04-25 20:55:35.781034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:20.216 [2024-04-25 20:55:35.781091] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0
00:08:20.216 [2024-04-25 20:55:35.781106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:20.216 #52 NEW cov: 11957 ft: 14952 corp: 40/2063b lim: 100 exec/s: 52 rss: 70Mb L: 75/89 MS: 1 InsertByte-
00:08:20.216 [2024-04-25 20:55:35.820868] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
00:08:20.216 [2024-04-25 20:55:35.820893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:20.216 #53 NEW cov: 11957 ft: 14962 corp: 41/2090b lim: 100 exec/s: 53 rss: 70Mb L: 27/89 MS: 1 ChangeBinInt-
00:08:20.216 [2024-04-25 20:55:35.860948] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
00:08:20.216 [2024-04-25 20:55:35.860974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:20.476 #54 NEW cov: 11957 ft: 14972 corp: 42/2118b lim: 100 exec/s: 54 rss: 70Mb L: 28/89 MS: 1 InsertByte-
00:08:20.476 [2024-04-25 20:55:35.901067] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
00:08:20.476 [2024-04-25 20:55:35.901093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:20.476 #55 NEW cov: 11957 ft: 14988 corp: 43/2145b lim: 100 exec/s: 55 rss: 70Mb L: 27/89 MS: 1 ShuffleBytes-
00:08:20.476 [2024-04-25 20:55:35.941381] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
00:08:20.476 [2024-04-25 20:55:35.941406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:20.476 [2024-04-25 20:55:35.941453] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
00:08:20.476 [2024-04-25 20:55:35.941468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:20.476 [2024-04-25 20:55:35.941525] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0
00:08:20.476 [2024-04-25 20:55:35.941539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:20.476 #56 NEW cov: 11957 ft: 15050 corp: 44/2220b lim: 100 exec/s: 28 rss: 70Mb L: 75/89 MS: 1 CopyPart-
00:08:20.476 #56 DONE cov: 11957 ft: 15050 corp: 44/2220b lim: 100 exec/s: 28 rss: 70Mb
00:08:20.476 ###### Recommended dictionary. ######
00:08:20.476 "\001\037" # Uses: 4
00:08:20.476 ###### End of recommended dictionary. ######
00:08:20.476 Done 56 runs in 2 second(s)
00:08:20.476 20:55:36 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_18.conf /var/tmp/suppress_nvmf_fuzz
00:08:20.476 20:55:36 -- ../common.sh@72 -- # (( i++ ))
00:08:20.476 20:55:36 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:20.476 20:55:36 -- ../common.sh@73 -- # start_llvm_fuzz 19 1 0x1
00:08:20.476 20:55:36 -- nvmf/run.sh@23 -- # local fuzzer_type=19
00:08:20.476 20:55:36 -- nvmf/run.sh@24 -- # local timen=1
00:08:20.476 20:55:36 -- nvmf/run.sh@25 -- # local core=0x1
00:08:20.476 20:55:36 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19
00:08:20.476 20:55:36 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_19.conf
00:08:20.476 20:55:36 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz
00:08:20.476 20:55:36 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0
00:08:20.476 20:55:36 -- nvmf/run.sh@34 -- # printf %02d 19
00:08:20.476 20:55:36 -- nvmf/run.sh@34 -- # port=4419
00:08:20.476 20:55:36 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19
00:08:20.476 20:55:36 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419'
00:08:20.476 20:55:36 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4419"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:20.476 20:55:36 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect
00:08:20.476 20:55:36 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create
00:08:20.476 20:55:36 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' -c /tmp/fuzz_json_19.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 -Z 19
00:08:20.735 [2024-04-25 20:55:36.104490] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization...
00:08:20.735 [2024-04-25 20:55:36.104547] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid200684 ]
00:08:20.735 EAL: No free 2048 kB hugepages reported on node 1
00:08:20.735 [2024-04-25 20:55:36.240549] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation.
00:08:20.735 [2024-04-25 20:55:36.277883] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:20.735 [2024-04-25 20:55:36.297218] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:20.735 [2024-04-25 20:55:36.349237] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:20.735 [2024-04-25 20:55:36.365565] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4419 ***
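The ../common.sh fragments repeated in the traces above ((( i++ )) and (( i < fuzz_num )) around each start_llvm_fuzz call) are the per-target driver loop; runs 17, 18 and 19 in this log are successive iterations of it, each with its own port, config and corpus directory. A paraphrased sketch of that loop's shape, not the literal script (only the argument order is taken from the trace):

    for (( i = 0; i < fuzz_num; i++ )); do
        start_llvm_fuzz "$i" 1 0x1   # fuzzer type, time budget (timen), core mask
    done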
00:08:20.735 INFO: Seed: 2225235438
00:08:20.994 INFO: Loaded 1 modules (348579 inline 8-bit counters): 348579 [0x2764bcc, 0x27b9d6f),
00:08:20.994 INFO: Loaded 1 PC tables (348579 PCs): 348579 [0x27b9d70,0x2d0b7a0),
00:08:20.994 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19
00:08:20.994 INFO: A corpus is not provided, starting from an empty corpus
00:08:20.994 #2 INITED exec/s: 0 rss: 60Mb
00:08:20.994 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:20.994 This may also happen if the target rejected all inputs we tried so far
00:08:20.994 [2024-04-25 20:55:36.431292] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:100663296 len:1
00:08:20.994 [2024-04-25 20:55:36.431333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:21.254 NEW_FUNC[1/669]: 0x4c4350 in fuzz_nvm_write_uncorrectable_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:582
00:08:21.254 NEW_FUNC[2/669]: 0x4e0360 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:08:21.254 #7 NEW cov: 11681 ft: 11689 corp: 2/15b lim: 50 exec/s: 0 rss: 68Mb L: 14/14 MS: 5 ChangeBit-ChangeBinInt-CopyPart-ChangeBit-InsertRepeatedBytes-
00:08:21.254 [2024-04-25 20:55:36.752298] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1229782938129862929 len:4370
00:08:21.254 [2024-04-25 20:55:36.752341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:21.254 NEW_FUNC[1/1]: 0x1d50460 in msg_queue_run_batch /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:808
00:08:21.254 #8 NEW cov: 11821 ft: 12213 corp: 3/25b lim: 50 exec/s: 0 rss: 68Mb L: 10/14 MS: 1 InsertRepeatedBytes-
00:08:21.254 [2024-04-25 20:55:36.802361] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:100663296 len:1
00:08:21.254 [2024-04-25 20:55:36.802389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:21.254 #14 NEW cov: 11827 ft: 12352 corp: 4/40b lim: 50 exec/s: 0 rss: 68Mb L: 15/15 MS: 1 InsertByte-
00:08:21.254 [2024-04-25 20:55:36.852623] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:100663296 len:3841
00:08:21.254 [2024-04-25 20:55:36.852652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:21.254 #15 NEW cov: 11912 ft: 12557 corp: 5/55b lim: 50 exec/s: 0 rss: 68Mb L: 15/15 MS: 1 ChangeBinInt-
00:08:21.254 [2024-04-25 20:55:36.902863] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:100664064 len:1
00:08:21.254 [2024-04-25 20:55:36.902893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:21.514 #16 NEW cov: 11912 ft: 12611 corp: 6/71b lim: 50 exec/s: 0 rss: 68Mb L: 16/16 MS: 1 InsertByte-
00:08:21.514 [2024-04-25 20:55:36.952999] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:100663369 len:1
00:08:21.514 [2024-04-25 20:55:36.953028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:21.514 #18 NEW cov: 11912 ft: 12645 corp: 7/81b lim: 50 exec/s: 0 rss: 68Mb L: 10/16 MS: 2 EraseBytes-InsertByte-
00:08:21.514 [2024-04-25 20:55:37.003186] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:100690761 len:1
00:08:21.514 [2024-04-25 20:55:37.003213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:21.514 #19 NEW cov: 11912 ft: 12802 corp: 8/91b lim: 50 exec/s: 0 rss: 69Mb L: 10/16 MS: 1 ChangeByte-
00:08:21.514 [2024-04-25 20:55:37.063290] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:100663296 len:1
00:08:21.514 [2024-04-25 20:55:37.063326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:21.514 #20 NEW cov: 11912 ft: 12843 corp: 9/106b lim: 50 exec/s: 0 rss: 69Mb L: 15/16 MS: 1 ChangeBinInt-
00:08:21.514 [2024-04-25 20:55:37.113804] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:100663296 len:1
00:08:21.514 [2024-04-25 20:55:37.113842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:21.514 #21 NEW cov: 11921 ft: 12910 corp: 10/121b lim: 50 exec/s: 0 rss: 69Mb L: 15/16 MS: 1 ChangeBinInt-
00:08:21.514 [2024-04-25 20:55:37.163556] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:7957419010510712430 len:28161
00:08:21.514 [2024-04-25 20:55:37.163584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:21.774 #23 NEW cov: 11921 ft: 13020 corp: 11/131b lim: 50 exec/s: 0 rss: 69Mb L: 10/16 MS: 2 InsertByte-InsertRepeatedBytes-
00:08:21.774 [2024-04-25 20:55:37.213844] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:100663296 len:1
00:08:21.774 [2024-04-25 20:55:37.213873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:21.774 #24 NEW cov: 11921 ft: 13120 corp: 12/142b lim: 50 exec/s: 0 rss: 69Mb L: 11/16 MS: 1 EraseBytes-
00:08:21.774 [2024-04-25 20:55:37.274100] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:100663296 len:3841
00:08:21.774 [2024-04-25 20:55:37.274129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:21.774 NEW_FUNC[1/1]: 0x19e6b30 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
00:08:21.774 #30 NEW cov: 11944 ft: 13204 corp: 13/157b lim: 50 exec/s: 0 rss: 69Mb L: 15/16 MS: 1 ChangeBinInt-
00:08:21.774 [2024-04-25 20:55:37.334395] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:100663296 len:1
00:08:21.774 [2024-04-25 20:55:37.334432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:21.774 [2024-04-25 20:55:37.334574] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:184683593728 len:1
00:08:21.774 [2024-04-25 20:55:37.334600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:21.774 #36 NEW cov: 11944 ft: 13527 corp: 14/186b lim: 50 exec/s: 0 rss: 69Mb L: 29/29 MS: 1 CopyPart-
00:08:21.774 [2024-04-25 20:55:37.385009] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:100663296 len:1
00:08:21.774 [2024-04-25 20:55:37.385047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:21.774 [2024-04-25 20:55:37.385131] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:721420288 len:1
00:08:21.774 [2024-04-25 20:55:37.385154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:21.774 [2024-04-25 20:55:37.385285] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1
00:08:21.774 [2024-04-25 20:55:37.385311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:21.774 [2024-04-25 20:55:37.385448] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1
00:08:21.774 [2024-04-25 20:55:37.385473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:21.774 #37 NEW cov: 11944 ft: 13868 corp: 15/228b lim: 50 exec/s: 37 rss: 69Mb L: 42/42 MS: 1 InsertRepeatedBytes-
00:08:21.774 [2024-04-25 20:55:37.435218] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:100663296 len:1
00:08:21.774 [2024-04-25 20:55:37.435252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:21.774 [2024-04-25 20:55:37.435367] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:721420288 len:1
00:08:21.774 [2024-04-25 20:55:37.435390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:21.774 [2024-04-25 20:55:37.435536] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:20993
00:08:21.774 [2024-04-25 20:55:37.435563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:21.774 [2024-04-25 20:55:37.435709] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1
00:08:21.774 [2024-04-25 20:55:37.435738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:22.033 #38 NEW cov: 11944 ft: 13944 corp: 16/271b lim: 50 exec/s: 38 rss: 69Mb L: 43/43 MS: 1 InsertByte-
00:08:22.033 [2024-04-25 20:55:37.494860] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:7957419010510712464 len:28161
00:08:22.033 [2024-04-25 20:55:37.494888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:22.033 #39 NEW cov: 11944 ft: 13992 corp: 17/281b lim: 50 exec/s: 39 rss: 69Mb L: 10/43 MS: 1 ChangeBinInt-
00:08:22.033 [2024-04-25 20:55:37.555016] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2305843009314357321 len:1
00:08:22.033 [2024-04-25 20:55:37.555044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:22.033 #40 NEW cov: 11944 ft: 14007 corp: 18/291b lim: 50 exec/s: 40 rss: 69Mb L: 10/43 MS: 1 ChangeBit-
00:08:22.033 [2024-04-25 20:55:37.605174] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:100663296 len:3841
00:08:22.033 [2024-04-25 20:55:37.605202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:22.033 #41 NEW cov: 11944 ft: 14039 corp: 19/306b lim: 50 exec/s: 41 rss: 69Mb L: 15/43 MS: 1 ShuffleBytes-
00:08:22.033 [2024-04-25 20:55:37.665362] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:6597170430720 len:1
00:08:22.033 [2024-04-25 20:55:37.665394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:22.033 #42 NEW cov: 11944 ft: 14045 corp: 20/322b lim: 50 exec/s: 42 rss: 70Mb L: 16/43 MS: 1 ChangeBinInt-
00:08:22.292 [2024-04-25 20:55:37.715461] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:16492775079936 len:11009
00:08:22.292 [2024-04-25 20:55:37.715488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:22.292 #43 NEW cov: 11944 ft: 14082 corp: 21/335b lim: 50 exec/s: 43 rss: 70Mb L: 13/43 MS: 1 EraseBytes-
00:08:22.292 [2024-04-25 20:55:37.766119] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:7998392934090042990 len:65536
00:08:22.292 [2024-04-25 20:55:37.766153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:22.292 [2024-04-25 20:55:37.766295] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536
00:08:22.292 [2024-04-25 20:55:37.766325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:22.292 [2024-04-25 20:55:37.766469] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536
00:08:22.292 [2024-04-25 20:55:37.766494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:22.292 [2024-04-25 20:55:37.766627] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:28271
00:08:22.292 [2024-04-25 20:55:37.766656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:22.292 #49 NEW cov: 11944 ft: 14094 corp: 22/378b lim: 50 exec/s: 49 rss: 70Mb L: 43/43 MS: 1 InsertRepeatedBytes-
00:08:22.292 [2024-04-25 20:55:37.815818] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:7957419156539600528 len:28271
00:08:22.292 [2024-04-25 20:55:37.815846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:22.292 #50 NEW cov: 11944 ft: 14138 corp: 23/391b lim: 50 exec/s: 50 rss: 70Mb L: 13/43 MS: 1 CopyPart-
00:08:22.292 [2024-04-25 20:55:37.865920] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:25870467840 len:1
00:08:22.292 [2024-04-25 20:55:37.865947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:22.292 #51 NEW cov: 11944 ft: 14162 corp: 24/408b lim: 50 exec/s: 51 rss: 70Mb L: 17/43 MS: 1 CopyPart-
00:08:22.292 [2024-04-25 20:55:37.916101] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:7966989159644528640 len:28271
00:08:22.293 [2024-04-25 20:55:37.916128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:22.293 #55 NEW cov: 11944 ft: 14168 corp: 25/423b lim: 50 exec/s: 55 rss: 70Mb L: 15/43 MS: 4 EraseBytes-CrossOver-ShuffleBytes-CrossOver-
00:08:22.552 [2024-04-25 20:55:37.966248] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:107151360 len:1
00:08:22.552 [2024-04-25 20:55:37.966279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:22.552 #56 NEW cov: 11944 ft: 14175 corp: 26/439b lim: 50 exec/s: 56 rss: 70Mb L: 16/43 MS: 1 InsertByte-
00:08:22.552 [2024-04-25 20:55:38.016549] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:100690761 len:1
00:08:22.552 [2024-04-25 20:55:38.016585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:22.552 [2024-04-25 20:55:38.016717] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1
00:08:22.552 [2024-04-25 20:55:38.016743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:22.552 #57 NEW cov: 11944 ft: 14197 corp: 27/466b lim: 50 exec/s: 57 rss: 70Mb L: 27/43 MS: 1 InsertRepeatedBytes-
00:08:22.552 [2024-04-25 20:55:38.066547] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:10407377209739865710 len:28271
00:08:22.552 [2024-04-25 20:55:38.066574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:22.552 #58 NEW cov: 11944 ft: 14215 corp: 28/477b lim: 50 exec/s: 58 rss: 70Mb L: 11/43 MS: 1 InsertByte-
00:08:22.552 [2024-04-25 20:55:38.116900] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1688849960954697 len:18721
00:08:22.552 [2024-04-25 20:55:38.116928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:22.552 #61 NEW cov: 11944 ft: 14218 corp: 29/494b lim: 50 exec/s: 61 rss: 70Mb L: 17/43 MS: 3 EraseBytes-CopyPart-CrossOver-
00:08:22.552 [2024-04-25 20:55:38.166998] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:100690761 len:1
00:08:22.552 [2024-04-25 20:55:38.167035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:22.552 [2024-04-25 20:55:38.167183] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1
00:08:22.552 [2024-04-25 20:55:38.167207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:22.552 #62 NEW cov: 11944 ft: 14229 corp: 30/517b lim: 50 exec/s: 62 rss: 70Mb L: 23/43 MS: 1 EraseBytes-
00:08:22.811 [2024-04-25 20:55:38.227075] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:100859904 len:1
00:08:22.811 [2024-04-25 20:55:38.227103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:22.811 #63 NEW cov: 11944 ft: 14239 corp: 31/533b lim: 50 exec/s: 63 rss: 70Mb L: 16/43 MS: 1 ShuffleBytes-
00:08:22.811 [2024-04-25 20:55:38.277228] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:100663296 len:1
00:08:22.811 [2024-04-25 20:55:38.277254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:22.811 #64 NEW cov: 11944 ft: 14326 corp: 32/545b lim: 50 exec/s: 64 rss: 70Mb L: 12/43 MS: 1 EraseBytes-
00:08:22.811 [2024-04-25 20:55:38.327464] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:7966989159644528640 len:28271
00:08:22.811 [2024-04-25 20:55:38.327491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:22.811 #65 NEW cov: 11944 ft: 14356 corp: 33/560b lim: 50 exec/s: 65 rss: 70Mb L: 15/43 MS: 1 ShuffleBytes-
00:08:22.811 [2024-04-25 20:55:38.377643] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2305843009314365513 len:1
00:08:22.811 [2024-04-25 20:55:38.377671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:22.811 #66 NEW cov: 11944 ft: 14368 corp: 34/570b lim: 50 exec/s: 66 rss: 70Mb L: 10/43 MS: 1 ChangeBit-
00:08:22.811 [2024-04-25 20:55:38.427785] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:863976447552342042 len:1
00:08:22.811 [2024-04-25 20:55:38.427811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:22.811 #67 NEW cov: 11944 ft: 14374 corp: 35/580b lim: 50 exec/s: 33 rss: 70Mb L: 10/43 MS: 1 CMP- DE: "b\251H\032\013\375v\000"-
00:08:22.811 #67 DONE cov: 11944 ft: 14374 corp: 35/580b lim: 50 exec/s: 33 rss: 70Mb
00:08:22.811 ###### Recommended dictionary. ######
00:08:22.811 "b\251H\032\013\375v\000" # Uses: 0
00:08:22.811 ###### End of recommended dictionary. ######
00:08:22.811 Done 67 runs in 2 second(s)
00:08:23.071 20:55:38 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_19.conf /var/tmp/suppress_nvmf_fuzz
20:55:38 -- ../common.sh@72 -- # (( i++ ))
20:55:38 -- ../common.sh@72 -- # (( i < fuzz_num ))
20:55:38 -- ../common.sh@73 -- # start_llvm_fuzz 20 1 0x1
20:55:38 -- nvmf/run.sh@23 -- # local fuzzer_type=20
20:55:38 -- nvmf/run.sh@24 -- # local timen=1
20:55:38 -- nvmf/run.sh@25 -- # local core=0x1
20:55:38 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20
20:55:38 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_20.conf
20:55:38 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz
20:55:38 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0
20:55:38 -- nvmf/run.sh@34 -- # printf %02d 20
20:55:38 -- nvmf/run.sh@34 -- # port=4420
20:55:38 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20
20:55:38 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420'
20:55:38 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4420"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
20:55:38 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect
20:55:38 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create
20:55:38 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' -c /tmp/fuzz_json_20.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 -Z 20
[2024-04-25 20:55:38.577539] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization...
[2024-04-25 20:55:38.577604] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid200970 ]
EAL: No free 2048 kB hugepages reported on node 1
[2024-04-25 20:55:38.718493] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation.
[2024-04-25 20:55:38.756189] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1
[2024-04-25 20:55:38.775411] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
[2024-04-25 20:55:38.827480] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
[2024-04-25 20:55:38.843754] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4420 ***
INFO: Running with entropic power schedule (0xFF, 100).
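The "#N NEW cov: ..." lines throughout these runs are libFuzzer's status reports: "cov" counts covered code edges, "ft" counts coverage features, "corp" gives corpus size as entries/bytes, "lim" is the current input-length cap, "exec/s" the execution rate, and the trailing "MS:" field names the mutation sequence that produced the input; "#N DONE" repeats the final totals at the end of a run. To summarize those end-of-run totals from a captured console log, a small sketch like the following should work (console.log is a hypothetical path for the saved log):

  # Print each run's final coverage and execution rate from a saved log.
  # Matches libFuzzer's "#N DONE cov: ... ft: ... exec/s: ..." status lines.
  awk '/ DONE cov:/ {
      for (i = 1; i <= NF; i++) {
          if ($i == "cov:")    cov = $(i + 1)
          if ($i == "ft:")     ft  = $(i + 1)
          if ($i == "exec/s:") eps = $(i + 1)
      }
      printf "run %d: cov=%s ft=%s exec/s=%s\n", ++run, cov, ft, eps
  }' console.log

Against the two runs above this would report cov=11957 for fuzzer 18 (WRITE ZEROES) and cov=11944 for fuzzer 19 (WRITE UNCORRECTABLE), which is a quick way to spot a run whose coverage collapsed.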
00:08:23.330 INFO: Seed: 408275441
00:08:23.330 INFO: Loaded 1 modules (348579 inline 8-bit counters): 348579 [0x2764bcc, 0x27b9d6f),
00:08:23.330 INFO: Loaded 1 PC tables (348579 PCs): 348579 [0x27b9d70,0x2d0b7a0),
00:08:23.330 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20
00:08:23.330 INFO: A corpus is not provided, starting from an empty corpus
00:08:23.330 #2 INITED exec/s: 0 rss: 61Mb
00:08:23.330 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:23.330 This may also happen if the target rejected all inputs we tried so far
00:08:23.330 [2024-04-25 20:55:38.910583] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0
00:08:23.330 [2024-04-25 20:55:38.910617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:23.330 [2024-04-25 20:55:38.910763] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0
00:08:23.330 [2024-04-25 20:55:38.910792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:23.330 [2024-04-25 20:55:38.910923] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0
00:08:23.330 [2024-04-25 20:55:38.910946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:23.330 [2024-04-25 20:55:38.911092] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0
00:08:23.330 [2024-04-25 20:55:38.911118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:23.589 NEW_FUNC[1/672]: 0x4c5f10 in fuzz_nvm_reservation_acquire_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:597
00:08:23.589 NEW_FUNC[2/672]: 0x4e0360 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:08:23.589 #9 NEW cov: 11749 ft: 11749 corp: 2/83b lim: 90 exec/s: 0 rss: 68Mb L: 82/82 MS: 2 CrossOver-InsertRepeatedBytes-
00:08:23.848 [2024-04-25 20:55:39.250676] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0
00:08:23.848 [2024-04-25 20:55:39.250719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:23.848 [2024-04-25 20:55:39.250834] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0
00:08:23.848 [2024-04-25 20:55:39.250857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:23.848 #21 NEW cov: 11879 ft: 12670 corp: 3/127b lim: 90 exec/s: 0 rss: 68Mb L: 44/82 MS: 2 ShuffleBytes-InsertRepeatedBytes-
00:08:23.848 [2024-04-25 20:55:39.290551] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0
00:08:23.848 [2024-04-25 20:55:39.290584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:23.848 [2024-04-25 20:55:39.290704] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0
00:08:23.848 [2024-04-25 20:55:39.290724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:23.848 #22 NEW cov: 11885 ft: 12917 corp: 4/177b lim: 90 exec/s: 0 rss: 68Mb L: 50/82 MS: 1 CopyPart-
00:08:23.848 [2024-04-25 20:55:39.340950] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0
00:08:23.848 [2024-04-25 20:55:39.340986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:23.848 [2024-04-25 20:55:39.341097] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0
00:08:23.848 [2024-04-25 20:55:39.341120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:23.848 [2024-04-25 20:55:39.341252] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0
00:08:23.848 [2024-04-25 20:55:39.341274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:23.848 #23 NEW cov: 11970 ft: 13431 corp: 5/238b lim: 90 exec/s: 0 rss: 69Mb L: 61/82 MS: 1 InsertRepeatedBytes-
00:08:23.848 [2024-04-25 20:55:39.391191] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0
00:08:23.848 [2024-04-25 20:55:39.391220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:23.848 [2024-04-25 20:55:39.391338] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0
00:08:23.848 [2024-04-25 20:55:39.391361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:23.848 [2024-04-25 20:55:39.391481] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0
00:08:23.848 [2024-04-25 20:55:39.391503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:23.848 #24 NEW cov: 11970 ft: 13541 corp: 6/296b lim: 90 exec/s: 0 rss: 69Mb L: 58/82 MS: 1 InsertRepeatedBytes-
00:08:23.848 [2024-04-25 20:55:39.431393] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0
00:08:23.848 [2024-04-25 20:55:39.431423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:23.848 [2024-04-25 20:55:39.431520] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0
00:08:23.848 [2024-04-25 20:55:39.431545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:23.848 [2024-04-25 20:55:39.431669] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0
00:08:23.848 [2024-04-25 20:55:39.431693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:23.848 [2024-04-25 20:55:39.431819] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0
00:08:23.848 [2024-04-25 20:55:39.431842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:23.848 #25 NEW cov: 11970 ft: 13658 corp: 7/379b lim: 90 exec/s: 0 rss: 69Mb L: 83/83 MS: 1 InsertByte-
00:08:23.848 [2024-04-25 20:55:39.481397] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0
00:08:23.848 [2024-04-25 20:55:39.481425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:23.848 [2024-04-25 20:55:39.481520] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0
00:08:23.848 [2024-04-25 20:55:39.481545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:23.848 [2024-04-25 20:55:39.481669] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0
00:08:23.848 [2024-04-25 20:55:39.481693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:23.848 #26 NEW cov: 11970 ft: 13756 corp: 8/440b lim: 90 exec/s: 0 rss: 69Mb L: 61/83 MS: 1 ChangeBit-
00:08:24.107 [2024-04-25 20:55:39.531768] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0
00:08:24.107 [2024-04-25 20:55:39.531798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:24.107 [2024-04-25 20:55:39.531876] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0
00:08:24.107 [2024-04-25 20:55:39.531896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:24.107 [2024-04-25 20:55:39.532010] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0
00:08:24.107 [2024-04-25 20:55:39.532029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:24.107 [2024-04-25 20:55:39.532151] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0
00:08:24.107 [2024-04-25 20:55:39.532173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:24.107 #27 NEW cov: 11970 ft: 13798 corp: 9/529b lim: 90 exec/s: 0 rss: 69Mb L: 89/89 MS: 1 CrossOver-
00:08:24.107 [2024-04-25 20:55:39.571573] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0
00:08:24.107 [2024-04-25 20:55:39.571600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:24.107 [2024-04-25 20:55:39.571727] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0
00:08:24.107 [2024-04-25 20:55:39.571747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:24.107 [2024-04-25 20:55:39.571884] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0
00:08:24.107 [2024-04-25 20:55:39.571906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:24.107 [2024-04-25 20:55:39.572024] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0
00:08:24.107 [2024-04-25 20:55:39.572047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:24.107 #28 NEW cov: 11970 ft: 13883 corp: 10/608b lim: 90 exec/s: 0 rss: 69Mb L: 79/89 MS: 1 InsertRepeatedBytes-
00:08:24.107 [2024-04-25 20:55:39.611573] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0
00:08:24.107 [2024-04-25 20:55:39.611600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:24.107 [2024-04-25 20:55:39.611727] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0
00:08:24.107 [2024-04-25 20:55:39.611747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:24.107 #29 NEW cov: 11970 ft: 13983 corp: 11/651b lim: 90 exec/s: 0 rss: 69Mb L: 43/89 MS: 1 CrossOver-
00:08:24.107 [2024-04-25 20:55:39.651925] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0
00:08:24.107 [2024-04-25 20:55:39.651952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:24.107 [2024-04-25 20:55:39.652083] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0
00:08:24.107 [2024-04-25 20:55:39.652107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:24.107 [2024-04-25 20:55:39.652224] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0
00:08:24.107 [2024-04-25 20:55:39.652247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:24.107 #30 NEW cov: 11970 ft: 14009 corp: 12/712b lim: 90 exec/s: 0 rss: 69Mb L: 61/89 MS: 1 ChangeBit-
00:08:24.107 [2024-04-25 20:55:39.692012] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0
00:08:24.107 [2024-04-25 20:55:39.692041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:24.107 [2024-04-25 20:55:39.692136] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0
00:08:24.107 [2024-04-25 20:55:39.692155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:24.107 [2024-04-25 20:55:39.692272] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0
00:08:24.107 [2024-04-25 20:55:39.692290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:24.108 #31 NEW cov: 11970 ft: 14048 corp: 13/773b lim: 90 exec/s: 0 rss: 69Mb L: 61/89 MS: 1 ChangeByte-
00:08:24.108 [2024-04-25 20:55:39.732437] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0
00:08:24.108 [2024-04-25 20:55:39.732467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:24.108 [2024-04-25 20:55:39.732556] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0
00:08:24.108 [2024-04-25 20:55:39.732577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:24.108 [2024-04-25 20:55:39.732690] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0
00:08:24.108 [2024-04-25 20:55:39.732714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:24.108 [2024-04-25 20:55:39.732829] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0
00:08:24.108 [2024-04-25 20:55:39.732851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:24.108 #32 NEW cov: 11970 ft: 14061 corp: 14/852b lim: 90 exec/s: 0 rss: 70Mb L: 79/89 MS: 1 ChangeBit-
00:08:24.367 [2024-04-25 20:55:39.782123] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0
00:08:24.367 [2024-04-25 20:55:39.782158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:24.367 [2024-04-25 20:55:39.782266] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0
00:08:24.367 [2024-04-25 20:55:39.782287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:24.367 NEW_FUNC[1/1]: 0x19e6b30 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
00:08:24.367 #33 NEW cov: 11993 ft: 14097 corp: 15/896b lim: 90 exec/s: 0 rss: 70Mb L: 44/89 MS: 1 CopyPart-
00:08:24.367 [2024-04-25 20:55:39.822503] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0
00:08:24.367 [2024-04-25 20:55:39.822531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:24.367 [2024-04-25 20:55:39.822642] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0
00:08:24.367 [2024-04-25 20:55:39.822659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:24.367 [2024-04-25 20:55:39.822785] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0
00:08:24.367 [2024-04-25 20:55:39.822813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:24.367 #36 NEW cov: 11993 ft: 14105 corp: 16/962b lim: 90 exec/s: 0 rss: 70Mb L: 66/89 MS: 3 ShuffleBytes-ShuffleBytes-InsertRepeatedBytes-
00:08:24.367 [2024-04-25 20:55:39.862489] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0
00:08:24.367 [2024-04-25 20:55:39.862520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:24.367 [2024-04-25 20:55:39.862614] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0
00:08:24.367 [2024-04-25 20:55:39.862634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:24.367 [2024-04-25 20:55:39.862748] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0
00:08:24.367 [2024-04-25 20:55:39.862771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:24.367 #37 NEW cov: 11993 ft: 14117 corp: 17/1020b lim: 90 exec/s: 37 rss: 70Mb L: 58/89 MS: 1 ChangeByte-
00:08:24.367 [2024-04-25 20:55:39.902667] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0
00:08:24.367 [2024-04-25 20:55:39.902695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:24.367 [2024-04-25 20:55:39.902783] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0
00:08:24.367 [2024-04-25 20:55:39.902808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:24.367 [2024-04-25 20:55:39.902922] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0
00:08:24.367 [2024-04-25 20:55:39.902943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:24.367 #38 NEW cov: 11993 ft: 14189 corp: 18/1081b lim: 90 exec/s: 38 rss: 70Mb L: 61/89 MS: 1 CMP- DE: "\377\007"-
00:08:24.367 [2024-04-25 20:55:39.942443] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0
00:08:24.367 [2024-04-25 20:55:39.942471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:24.367 [2024-04-25 20:55:39.942588] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0
00:08:24.367 [2024-04-25 20:55:39.942610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:24.367 #39 NEW cov: 11993 ft: 14198 corp: 19/1126b lim: 90 exec/s: 39 rss: 70Mb L: 45/89 MS: 1 InsertByte-
00:08:24.367 [2024-04-25 20:55:39.982756] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0
00:08:24.367 [2024-04-25 20:55:39.982788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:24.367 [2024-04-25 20:55:39.982895] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0
00:08:24.367 [2024-04-25 20:55:39.982915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:24.367 [2024-04-25 20:55:39.983031] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0
00:08:24.367 [2024-04-25 20:55:39.983057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:24.367 #40 NEW cov: 11993 ft: 14208 corp: 20/1186b lim: 90 exec/s: 40 rss: 70Mb L: 60/89 MS: 1 EraseBytes-
00:08:24.367 [2024-04-25 20:55:40.022943] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0
00:08:24.367 [2024-04-25 20:55:40.022974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:24.367 [2024-04-25 20:55:40.023084] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0
00:08:24.367 [2024-04-25 20:55:40.023105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:24.367 [2024-04-25 20:55:40.023231] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0
00:08:24.367 [2024-04-25 20:55:40.023254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:24.626 #41 NEW cov: 11993 ft: 14228 corp: 21/1252b lim: 90 exec/s: 41 rss: 70Mb L: 66/89 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"-
00:08:24.626 [2024-04-25 20:55:40.073620] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0
00:08:24.626 [2024-04-25 20:55:40.073652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:24.626 [2024-04-25 20:55:40.073735] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0
00:08:24.626 [2024-04-25 20:55:40.073757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:24.626 [2024-04-25 20:55:40.073884] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0
00:08:24.626 [2024-04-25 20:55:40.073906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:24.626 [2024-04-25 20:55:40.074024] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0
00:08:24.626 [2024-04-25 20:55:40.074047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:24.626 [2024-04-25 20:55:40.074169] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:4 nsid:0
00:08:24.626 [2024-04-25 20:55:40.074188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1
00:08:24.626 #42 NEW cov: 11993 ft: 14278 corp: 22/1342b lim: 90 exec/s: 42 rss: 70Mb L: 90/90 MS: 1 InsertByte-
00:08:24.626 [2024-04-25 20:55:40.123015] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0
00:08:24.626 [2024-04-25 20:55:40.123053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:24.626 [2024-04-25 20:55:40.123170] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0
00:08:24.626 [2024-04-25 20:55:40.123190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:24.626 #43 NEW cov: 11993 ft: 14351 corp: 23/1386b lim: 90 exec/s: 43 rss: 70Mb L: 44/90 MS: 1 InsertByte-
00:08:24.626 [2024-04-25 20:55:40.173735] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0
00:08:24.626 [2024-04-25 20:55:40.173765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:24.626 [2024-04-25 20:55:40.173843] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0
00:08:24.626 [2024-04-25 20:55:40.173866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:24.627 [2024-04-25 20:55:40.173981] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0
00:08:24.627 [2024-04-25 20:55:40.174007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:24.627 [2024-04-25 20:55:40.174130] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0
00:08:24.627 [2024-04-25 20:55:40.174151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:24.627 #44 NEW cov: 11993 ft: 14394 corp: 24/1474b lim: 90 exec/s: 44 rss: 70Mb L: 88/90 MS: 1 CrossOver-
00:08:24.627 [2024-04-25 20:55:40.213801] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0
00:08:24.627 [2024-04-25 20:55:40.213832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:24.627 [2024-04-25 20:55:40.213908] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0
00:08:24.627 [2024-04-25 20:55:40.213931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:24.627 [2024-04-25 20:55:40.214051] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0
00:08:24.627 [2024-04-25 20:55:40.214074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:24.627 [2024-04-25 20:55:40.214194] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0
00:08:24.627 [2024-04-25 20:55:40.214220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:24.627 #45 NEW cov: 11993 ft: 14407 corp: 25/1553b lim: 90 exec/s: 45 rss: 70Mb L: 79/90 MS: 1 ChangeByte-
00:08:24.627 [2024-04-25 20:55:40.253698] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0
00:08:24.627 [2024-04-25 20:55:40.253727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:24.627 [2024-04-25 20:55:40.253838] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0
00:08:24.627 [2024-04-25 20:55:40.253864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:24.627 [2024-04-25 20:55:40.253997] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0
00:08:24.627 [2024-04-25 20:55:40.254036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:24.627 #46 NEW cov: 11993 ft: 14412 corp: 26/1619b lim: 90 exec/s: 46 rss: 70Mb L: 66/90 MS: 1 ChangeBit-
00:08:24.886 [2024-04-25 20:55:40.303419] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0
00:08:24.886 [2024-04-25 20:55:40.303447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:24.886 #47 NEW cov: 11993 ft: 15215 corp: 27/1649b lim: 90 exec/s: 47 rss: 70Mb L: 30/90 MS: 1 EraseBytes-
00:08:24.886 [2024-04-25 20:55:40.353750] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0
00:08:24.886 [2024-04-25 20:55:40.353782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:24.886 [2024-04-25 20:55:40.353899] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0
00:08:24.886 [2024-04-25 20:55:40.353918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:24.886 #48 NEW cov: 11993 ft: 15241 corp: 28/1694b lim: 90 exec/s: 48 rss: 70Mb L: 45/90 MS: 1 ChangeByte-
00:08:24.886 [2024-04-25 20:55:40.404338] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0
00:08:24.886 [2024-04-25 20:55:40.404370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:24.886 [2024-04-25 20:55:40.404459] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0
00:08:24.886 [2024-04-25 20:55:40.404478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:24.886 [2024-04-25 20:55:40.404599] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0
00:08:24.886 [2024-04-25 20:55:40.404622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:24.886 [2024-04-25 20:55:40.404738] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0
00:08:24.886 [2024-04-25 20:55:40.404760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:24.886 #49 NEW cov: 11993 ft: 15286 corp: 29/1779b lim: 90 exec/s: 49 rss: 70Mb L: 85/90 MS: 1 InsertRepeatedBytes-
00:08:24.886 [2024-04-25 20:55:40.444280] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0
00:08:24.886 [2024-04-25 20:55:40.444311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:24.886 [2024-04-25 20:55:40.444429] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0
00:08:24.886 [2024-04-25 20:55:40.444453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:24.886 [2024-04-25 20:55:40.444580] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0
00:08:24.886 [2024-04-25 20:55:40.444600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:24.886 #50 NEW cov: 11993 ft: 15306 corp: 30/1847b lim: 90 exec/s: 50 rss: 70Mb L: 68/90 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\017"-
00:08:24.886 [2024-04-25 20:55:40.494650] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0
00:08:24.886 [2024-04-25 20:55:40.494681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:24.886 [2024-04-25 20:55:40.494811] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0
00:08:24.886 [2024-04-25 20:55:40.494836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:24.886 [2024-04-25 20:55:40.494964] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0
00:08:24.886 [2024-04-25 20:55:40.494997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:24.886 [2024-04-25 20:55:40.495121] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0
00:08:24.886 [2024-04-25 20:55:40.495145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:24.886 #51 NEW cov: 11993 ft: 15337 corp: 31/1934b lim: 90 exec/s: 51 rss: 70Mb L: 87/90 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\017"-
00:08:24.886 [2024-04-25 20:55:40.544857] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0
00:08:24.886 [2024-04-25 20:55:40.544888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:24.886 [2024-04-25 20:55:40.544965] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0
00:08:24.886 [2024-04-25 20:55:40.544998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:24.886 [2024-04-25 20:55:40.545116] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0
00:08:24.886 [2024-04-25 20:55:40.545144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:24.886 [2024-04-25 20:55:40.545261] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0
00:08:24.886 [2024-04-25 20:55:40.545283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:25.145 #52 NEW cov: 11993 ft: 15352 corp: 32/2018b lim: 90 exec/s: 52 rss: 70Mb L: 84/90 MS: 1 InsertByte-
00:08:25.145 [2024-04-25 20:55:40.594933] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0
00:08:25.145 [2024-04-25 20:55:40.594965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:25.145 [2024-04-25 20:55:40.595069] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0
00:08:25.145 [2024-04-25 20:55:40.595095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:25.145 [2024-04-25 20:55:40.595209] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0
00:08:25.145 [2024-04-25 20:55:40.595233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:25.145 [2024-04-25 20:55:40.595354] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0
00:08:25.145 [2024-04-25 20:55:40.595377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:25.145 #53 NEW cov: 11993 ft: 15354 corp: 33/2103b lim: 90 exec/s: 53 rss: 70Mb L: 85/90 MS: 1 ShuffleBytes-
00:08:25.145 [2024-04-25 20:55:40.645142] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0
00:08:25.145 [2024-04-25 20:55:40.645176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:25.145 [2024-04-25 20:55:40.645292] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0
00:08:25.145 [2024-04-25 20:55:40.645319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:25.145 [2024-04-25 20:55:40.645442] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0
00:08:25.145 [2024-04-25 20:55:40.645465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:25.145 [2024-04-25 20:55:40.645587] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0
00:08:25.145 [2024-04-25 20:55:40.645614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:25.145 #54 NEW cov: 11993 ft: 15363 corp: 34/2186b lim: 90 exec/s: 54 rss: 71Mb L: 83/90 MS: 1 ShuffleBytes-
00:08:25.145 [2024-04-25 20:55:40.685420] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0
00:08:25.145 [2024-04-25 20:55:40.685449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:25.145 [2024-04-25 20:55:40.685570] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0
00:08:25.145 [2024-04-25 20:55:40.685594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:25.145 [2024-04-25 20:55:40.685715] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0
00:08:25.145 [2024-04-25 20:55:40.685740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:25.146 [2024-04-25 20:55:40.685864] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0
00:08:25.146 [2024-04-25 20:55:40.685888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:25.146 [2024-04-25 20:55:40.686014] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:4 nsid:0
00:08:25.146 [2024-04-25 20:55:40.686036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1
00:08:25.146 #55 NEW cov: 11993 ft: 15377 corp: 35/2276b lim: 90 exec/s: 55 rss: 71Mb L: 90/90 MS: 1 CrossOver-
00:08:25.146 [2024-04-25 20:55:40.725306] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0
00:08:25.146 [2024-04-25 20:55:40.725333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:25.146 [2024-04-25 20:55:40.725430] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0
00:08:25.146 [2024-04-25 20:55:40.725455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:25.146 [2024-04-25 20:55:40.725577] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0
00:08:25.146 [2024-04-25 20:55:40.725600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:25.146 [2024-04-25 20:55:40.725727] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0
00:08:25.146 [2024-04-25 20:55:40.725747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:25.146 #56 NEW cov: 11993 ft: 15399 corp: 36/2364b lim: 90 exec/s: 56 rss: 71Mb L: 88/90 MS: 1 ChangeByte-
00:08:25.146 [2024-04-25 20:55:40.765365] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0
00:08:25.146 [2024-04-25 20:55:40.765395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:25.146 [2024-04-25 20:55:40.765491] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0
00:08:25.146 [2024-04-25 20:55:40.765513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:25.146 [2024-04-25 20:55:40.765638] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0
00:08:25.146 [2024-04-25 20:55:40.765658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:25.146 [2024-04-25 20:55:40.765780] nvme_qpair.c: 256:nvme_io_qpair_print_command:
*NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:25.146 [2024-04-25 20:55:40.765805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:25.146 #57 NEW cov: 11993 ft: 15407 corp: 37/2448b lim: 90 exec/s: 57 rss: 71Mb L: 84/90 MS: 1 InsertByte- 00:08:25.146 [2024-04-25 20:55:40.804796] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.146 [2024-04-25 20:55:40.804824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.405 #58 NEW cov: 11993 ft: 15426 corp: 38/2478b lim: 90 exec/s: 58 rss: 71Mb L: 30/90 MS: 1 ChangeByte- 00:08:25.405 [2024-04-25 20:55:40.845615] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.405 [2024-04-25 20:55:40.845644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.405 [2024-04-25 20:55:40.845719] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:25.405 [2024-04-25 20:55:40.845741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.405 [2024-04-25 20:55:40.845855] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:25.405 [2024-04-25 20:55:40.845880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.405 [2024-04-25 20:55:40.846004] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:25.405 [2024-04-25 20:55:40.846027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:25.405 #59 NEW cov: 11993 ft: 15448 corp: 39/2566b lim: 90 exec/s: 59 rss: 71Mb L: 88/90 MS: 1 InsertByte- 00:08:25.405 [2024-04-25 20:55:40.885789] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.405 [2024-04-25 20:55:40.885819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.405 [2024-04-25 20:55:40.885881] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:25.405 [2024-04-25 20:55:40.885900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.405 [2024-04-25 20:55:40.886020] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:25.405 [2024-04-25 20:55:40.886055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.405 [2024-04-25 20:55:40.886175] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:25.405 [2024-04-25 20:55:40.886194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:25.405 #60 NEW cov: 11993 ft: 15453 corp: 40/2650b lim: 90 exec/s: 30 rss: 71Mb L: 84/90 
MS: 1 ShuffleBytes- 00:08:25.405 #60 DONE cov: 11993 ft: 15453 corp: 40/2650b lim: 90 exec/s: 30 rss: 71Mb 00:08:25.405 ###### Recommended dictionary. ###### 00:08:25.405 "\377\007" # Uses: 0 00:08:25.405 "\000\000\000\000\000\000\000\000" # Uses: 0 00:08:25.405 "\000\000\000\000\000\000\000\017" # Uses: 1 00:08:25.405 ###### End of recommended dictionary. ###### 00:08:25.405 Done 60 runs in 2 second(s) 00:08:25.405 20:55:41 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_20.conf /var/tmp/suppress_nvmf_fuzz 00:08:25.405 20:55:41 -- ../common.sh@72 -- # (( i++ )) 00:08:25.405 20:55:41 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:25.405 20:55:41 -- ../common.sh@73 -- # start_llvm_fuzz 21 1 0x1 00:08:25.405 20:55:41 -- nvmf/run.sh@23 -- # local fuzzer_type=21 00:08:25.405 20:55:41 -- nvmf/run.sh@24 -- # local timen=1 00:08:25.405 20:55:41 -- nvmf/run.sh@25 -- # local core=0x1 00:08:25.405 20:55:41 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:25.405 20:55:41 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_21.conf 00:08:25.405 20:55:41 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:25.405 20:55:41 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:25.405 20:55:41 -- nvmf/run.sh@34 -- # printf %02d 21 00:08:25.405 20:55:41 -- nvmf/run.sh@34 -- # port=4421 00:08:25.406 20:55:41 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:25.406 20:55:41 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' 00:08:25.406 20:55:41 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4421"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:25.406 20:55:41 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:25.406 20:55:41 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:25.406 20:55:41 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' -c /tmp/fuzz_json_21.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 -Z 21 00:08:25.406 [2024-04-25 20:55:41.065956] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:08:25.406 [2024-04-25 20:55:41.066057] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid201504 ] 00:08:25.665 EAL: No free 2048 kB hugepages reported on node 1 00:08:25.665 [2024-04-25 20:55:41.209078] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 
00:08:25.665 [2024-04-25 20:55:41.245938] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:25.665 [2024-04-25 20:55:41.265300] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:25.665 [2024-04-25 20:55:41.317363] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:25.924 [2024-04-25 20:55:41.333624] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4421 *** 00:08:25.924 INFO: Running with entropic power schedule (0xFF, 100). 00:08:25.924 INFO: Seed: 2898265403 00:08:25.924 INFO: Loaded 1 modules (348579 inline 8-bit counters): 348579 [0x2764bcc, 0x27b9d6f), 00:08:25.924 INFO: Loaded 1 PC tables (348579 PCs): 348579 [0x27b9d70,0x2d0b7a0), 00:08:25.924 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:25.924 INFO: A corpus is not provided, starting from an empty corpus 00:08:25.924 #2 INITED exec/s: 0 rss: 61Mb 00:08:25.924 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:25.924 This may also happen if the target rejected all inputs we tried so far 00:08:25.924 [2024-04-25 20:55:41.399063] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:25.924 [2024-04-25 20:55:41.399093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.924 [2024-04-25 20:55:41.399143] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:25.924 [2024-04-25 20:55:41.399158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.924 [2024-04-25 20:55:41.399217] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:25.924 [2024-04-25 20:55:41.399232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.184 NEW_FUNC[1/672]: 0x4c9130 in fuzz_nvm_reservation_release_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:623 00:08:26.184 NEW_FUNC[2/672]: 0x4e0360 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:26.184 #41 NEW cov: 11724 ft: 11725 corp: 2/39b lim: 50 exec/s: 0 rss: 68Mb L: 38/38 MS: 4 ShuffleBytes-CrossOver-InsertByte-InsertRepeatedBytes- 00:08:26.184 [2024-04-25 20:55:41.699969] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.184 [2024-04-25 20:55:41.700052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.184 [2024-04-25 20:55:41.700162] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.184 [2024-04-25 20:55:41.700200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.184 #45 NEW cov: 11854 ft: 12749 corp: 3/60b lim: 50 exec/s: 0 rss: 68Mb L: 21/38 MS: 4 InsertRepeatedBytes-ChangeBinInt-ChangeBit-InsertRepeatedBytes- 00:08:26.184 [2024-04-25 20:55:41.750029] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 
00:08:26.184 [2024-04-25 20:55:41.750059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.184 [2024-04-25 20:55:41.750098] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.184 [2024-04-25 20:55:41.750114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.184 [2024-04-25 20:55:41.750168] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:26.184 [2024-04-25 20:55:41.750183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.184 [2024-04-25 20:55:41.750237] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:26.184 [2024-04-25 20:55:41.750252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.184 #51 NEW cov: 11860 ft: 13245 corp: 4/109b lim: 50 exec/s: 0 rss: 68Mb L: 49/49 MS: 1 InsertRepeatedBytes- 00:08:26.184 [2024-04-25 20:55:41.800143] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.184 [2024-04-25 20:55:41.800171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.184 [2024-04-25 20:55:41.800212] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.184 [2024-04-25 20:55:41.800226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.184 [2024-04-25 20:55:41.800279] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:26.184 [2024-04-25 20:55:41.800294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.184 [2024-04-25 20:55:41.800350] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:26.184 [2024-04-25 20:55:41.800364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.184 #52 NEW cov: 11945 ft: 13462 corp: 5/158b lim: 50 exec/s: 0 rss: 68Mb L: 49/49 MS: 1 CopyPart- 00:08:26.184 [2024-04-25 20:55:41.840016] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.184 [2024-04-25 20:55:41.840043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.184 [2024-04-25 20:55:41.840086] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.184 [2024-04-25 20:55:41.840103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.442 #53 NEW cov: 11945 ft: 13560 corp: 6/179b lim: 50 exec/s: 0 rss: 69Mb L: 21/49 MS: 1 CMP- DE: "\377\003"- 00:08:26.442 [2024-04-25 20:55:41.880398] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 
nsid:0 00:08:26.442 [2024-04-25 20:55:41.880424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.442 [2024-04-25 20:55:41.880470] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.442 [2024-04-25 20:55:41.880484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.442 [2024-04-25 20:55:41.880537] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:26.442 [2024-04-25 20:55:41.880552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.442 [2024-04-25 20:55:41.880608] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:26.442 [2024-04-25 20:55:41.880624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.442 #54 NEW cov: 11945 ft: 13644 corp: 7/221b lim: 50 exec/s: 0 rss: 69Mb L: 42/49 MS: 1 EraseBytes- 00:08:26.442 [2024-04-25 20:55:41.920333] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.442 [2024-04-25 20:55:41.920361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.442 [2024-04-25 20:55:41.920399] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.442 [2024-04-25 20:55:41.920414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.442 [2024-04-25 20:55:41.920469] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:26.442 [2024-04-25 20:55:41.920485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.442 #55 NEW cov: 11945 ft: 13736 corp: 8/259b lim: 50 exec/s: 0 rss: 69Mb L: 38/49 MS: 1 ChangeBinInt- 00:08:26.442 [2024-04-25 20:55:41.960586] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.442 [2024-04-25 20:55:41.960614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.442 [2024-04-25 20:55:41.960676] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.442 [2024-04-25 20:55:41.960690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.442 [2024-04-25 20:55:41.960746] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:26.442 [2024-04-25 20:55:41.960759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.442 [2024-04-25 20:55:41.960814] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:26.442 [2024-04-25 20:55:41.960829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE 
OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.442 #56 NEW cov: 11945 ft: 13803 corp: 9/305b lim: 50 exec/s: 0 rss: 69Mb L: 46/49 MS: 1 InsertRepeatedBytes- 00:08:26.442 [2024-04-25 20:55:42.000267] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.442 [2024-04-25 20:55:42.000294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.442 #57 NEW cov: 11945 ft: 14603 corp: 10/319b lim: 50 exec/s: 0 rss: 69Mb L: 14/49 MS: 1 CrossOver- 00:08:26.442 [2024-04-25 20:55:42.040634] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.443 [2024-04-25 20:55:42.040660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.443 [2024-04-25 20:55:42.040698] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.443 [2024-04-25 20:55:42.040713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.443 [2024-04-25 20:55:42.040765] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:26.443 [2024-04-25 20:55:42.040780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.443 #58 NEW cov: 11945 ft: 14664 corp: 11/357b lim: 50 exec/s: 0 rss: 69Mb L: 38/49 MS: 1 ChangeBinInt- 00:08:26.443 [2024-04-25 20:55:42.080789] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.443 [2024-04-25 20:55:42.080816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.443 [2024-04-25 20:55:42.080852] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.443 [2024-04-25 20:55:42.080868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.443 [2024-04-25 20:55:42.080923] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:26.443 [2024-04-25 20:55:42.080937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.443 #59 NEW cov: 11945 ft: 14716 corp: 12/395b lim: 50 exec/s: 0 rss: 69Mb L: 38/49 MS: 1 ChangeBinInt- 00:08:26.767 [2024-04-25 20:55:42.120752] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.767 [2024-04-25 20:55:42.120780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.767 [2024-04-25 20:55:42.120829] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.767 [2024-04-25 20:55:42.120844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.767 #60 NEW cov: 11945 ft: 14744 corp: 13/416b lim: 50 exec/s: 0 rss: 69Mb L: 21/49 MS: 1 PersAutoDict- DE: "\377\003"- 00:08:26.767 
[2024-04-25 20:55:42.160733] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.767 [2024-04-25 20:55:42.160759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.767 #61 NEW cov: 11945 ft: 14781 corp: 14/434b lim: 50 exec/s: 0 rss: 69Mb L: 18/49 MS: 1 EraseBytes- 00:08:26.767 [2024-04-25 20:55:42.201104] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.767 [2024-04-25 20:55:42.201131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.767 [2024-04-25 20:55:42.201171] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.767 [2024-04-25 20:55:42.201187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.767 [2024-04-25 20:55:42.201243] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:26.767 [2024-04-25 20:55:42.201259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.767 #62 NEW cov: 11945 ft: 14793 corp: 15/472b lim: 50 exec/s: 0 rss: 69Mb L: 38/49 MS: 1 CopyPart- 00:08:26.767 [2024-04-25 20:55:42.241304] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.767 [2024-04-25 20:55:42.241332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.767 [2024-04-25 20:55:42.241370] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.767 [2024-04-25 20:55:42.241385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.767 [2024-04-25 20:55:42.241440] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:26.767 [2024-04-25 20:55:42.241455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.767 #63 NEW cov: 11945 ft: 14802 corp: 16/510b lim: 50 exec/s: 0 rss: 69Mb L: 38/49 MS: 1 ChangeBinInt- 00:08:26.767 [2024-04-25 20:55:42.281510] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.767 [2024-04-25 20:55:42.281536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.767 [2024-04-25 20:55:42.281574] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.767 [2024-04-25 20:55:42.281589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.767 [2024-04-25 20:55:42.281645] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:26.767 [2024-04-25 20:55:42.281661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.767 [2024-04-25 
20:55:42.281715] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:26.767 [2024-04-25 20:55:42.281730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.767 NEW_FUNC[1/1]: 0x19e6b30 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:26.767 #64 NEW cov: 11968 ft: 14840 corp: 17/550b lim: 50 exec/s: 0 rss: 69Mb L: 40/49 MS: 1 PersAutoDict- DE: "\377\003"- 00:08:26.767 [2024-04-25 20:55:42.331503] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.767 [2024-04-25 20:55:42.331529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.767 [2024-04-25 20:55:42.331568] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.767 [2024-04-25 20:55:42.331583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.767 [2024-04-25 20:55:42.331639] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:26.767 [2024-04-25 20:55:42.331655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.767 [2024-04-25 20:55:42.361564] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.767 [2024-04-25 20:55:42.361591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.767 [2024-04-25 20:55:42.361630] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.767 [2024-04-25 20:55:42.361645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.767 [2024-04-25 20:55:42.361700] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:26.767 [2024-04-25 20:55:42.361718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.767 #66 NEW cov: 11968 ft: 14849 corp: 18/588b lim: 50 exec/s: 66 rss: 69Mb L: 38/49 MS: 2 ChangeBinInt-PersAutoDict- DE: "\377\003"- 00:08:27.054 [2024-04-25 20:55:42.391676] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.055 [2024-04-25 20:55:42.391704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.055 [2024-04-25 20:55:42.391742] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.055 [2024-04-25 20:55:42.391758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.055 [2024-04-25 20:55:42.391818] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.055 [2024-04-25 20:55:42.391833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 
cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.055 #67 NEW cov: 11968 ft: 14907 corp: 19/626b lim: 50 exec/s: 67 rss: 70Mb L: 38/49 MS: 1 ChangeByte- 00:08:27.055 [2024-04-25 20:55:42.431773] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.055 [2024-04-25 20:55:42.431800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.055 [2024-04-25 20:55:42.431838] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.055 [2024-04-25 20:55:42.431853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.055 [2024-04-25 20:55:42.431910] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.055 [2024-04-25 20:55:42.431925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.055 #68 NEW cov: 11968 ft: 14932 corp: 20/664b lim: 50 exec/s: 68 rss: 70Mb L: 38/49 MS: 1 PersAutoDict- DE: "\377\003"- 00:08:27.055 [2024-04-25 20:55:42.472045] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.055 [2024-04-25 20:55:42.472071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.055 [2024-04-25 20:55:42.472118] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.055 [2024-04-25 20:55:42.472138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.055 [2024-04-25 20:55:42.472193] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.055 [2024-04-25 20:55:42.472208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.055 [2024-04-25 20:55:42.472261] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:27.055 [2024-04-25 20:55:42.472276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.055 #69 NEW cov: 11968 ft: 14989 corp: 21/708b lim: 50 exec/s: 69 rss: 70Mb L: 44/49 MS: 1 CopyPart- 00:08:27.055 [2024-04-25 20:55:42.512198] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.055 [2024-04-25 20:55:42.512225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.055 [2024-04-25 20:55:42.512262] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.055 [2024-04-25 20:55:42.512276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.055 [2024-04-25 20:55:42.512351] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.055 [2024-04-25 20:55:42.512366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.055 [2024-04-25 20:55:42.512421] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:27.055 [2024-04-25 20:55:42.512437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.055 #70 NEW cov: 11968 ft: 15003 corp: 22/752b lim: 50 exec/s: 70 rss: 70Mb L: 44/49 MS: 1 InsertRepeatedBytes- 00:08:27.055 [2024-04-25 20:55:42.552186] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.055 [2024-04-25 20:55:42.552212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.055 [2024-04-25 20:55:42.552266] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.055 [2024-04-25 20:55:42.552284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.055 [2024-04-25 20:55:42.552350] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.055 [2024-04-25 20:55:42.552365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.055 #71 NEW cov: 11968 ft: 15012 corp: 23/790b lim: 50 exec/s: 71 rss: 70Mb L: 38/49 MS: 1 ShuffleBytes- 00:08:27.055 [2024-04-25 20:55:42.592149] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.055 [2024-04-25 20:55:42.592175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.055 [2024-04-25 20:55:42.592218] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.055 [2024-04-25 20:55:42.592241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.055 #72 NEW cov: 11968 ft: 15109 corp: 24/811b lim: 50 exec/s: 72 rss: 70Mb L: 21/49 MS: 1 ChangeByte- 00:08:27.055 [2024-04-25 20:55:42.632513] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.055 [2024-04-25 20:55:42.632540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.055 [2024-04-25 20:55:42.632587] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.055 [2024-04-25 20:55:42.632609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.055 [2024-04-25 20:55:42.632664] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.055 [2024-04-25 20:55:42.632679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.055 [2024-04-25 20:55:42.632732] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:27.055 [2024-04-25 20:55:42.632747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.055 #73 NEW cov: 11968 ft: 15126 corp: 25/855b lim: 50 exec/s: 73 rss: 70Mb L: 44/49 MS: 1 ChangeByte- 00:08:27.055 [2024-04-25 20:55:42.672617] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.055 [2024-04-25 20:55:42.672645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.055 [2024-04-25 20:55:42.672689] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.055 [2024-04-25 20:55:42.672713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.055 [2024-04-25 20:55:42.672766] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.055 [2024-04-25 20:55:42.672781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.055 [2024-04-25 20:55:42.672834] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:27.055 [2024-04-25 20:55:42.672852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.055 #74 NEW cov: 11968 ft: 15131 corp: 26/899b lim: 50 exec/s: 74 rss: 70Mb L: 44/49 MS: 1 ShuffleBytes- 00:08:27.055 [2024-04-25 20:55:42.712754] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.055 [2024-04-25 20:55:42.712781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.055 [2024-04-25 20:55:42.712827] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.055 [2024-04-25 20:55:42.712850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.055 [2024-04-25 20:55:42.712906] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.055 [2024-04-25 20:55:42.712922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.055 [2024-04-25 20:55:42.712980] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:27.055 [2024-04-25 20:55:42.713001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.316 #75 NEW cov: 11968 ft: 15142 corp: 27/941b lim: 50 exec/s: 75 rss: 70Mb L: 42/49 MS: 1 ChangeByte- 00:08:27.316 [2024-04-25 20:55:42.752562] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.316 [2024-04-25 20:55:42.752588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.316 [2024-04-25 20:55:42.752639] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.316 [2024-04-25 20:55:42.752656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.316 #76 NEW cov: 11968 ft: 15148 corp: 28/968b lim: 50 exec/s: 76 rss: 70Mb L: 27/49 MS: 1 EraseBytes- 00:08:27.316 [2024-04-25 20:55:42.793014] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.316 [2024-04-25 20:55:42.793040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.316 [2024-04-25 20:55:42.793086] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.316 [2024-04-25 20:55:42.793101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.316 [2024-04-25 20:55:42.793177] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.316 [2024-04-25 20:55:42.793193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.316 [2024-04-25 20:55:42.793246] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:27.316 [2024-04-25 20:55:42.793261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.316 #77 NEW cov: 11968 ft: 15177 corp: 29/1011b lim: 50 exec/s: 77 rss: 70Mb L: 43/49 MS: 1 CrossOver- 00:08:27.316 [2024-04-25 20:55:42.832653] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.316 [2024-04-25 20:55:42.832679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.316 #78 NEW cov: 11968 ft: 15233 corp: 30/1025b lim: 50 exec/s: 78 rss: 70Mb L: 14/49 MS: 1 CrossOver- 00:08:27.316 [2024-04-25 20:55:42.873214] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.316 [2024-04-25 20:55:42.873241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.316 [2024-04-25 20:55:42.873286] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.316 [2024-04-25 20:55:42.873310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.316 [2024-04-25 20:55:42.873367] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.316 [2024-04-25 20:55:42.873382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.316 [2024-04-25 20:55:42.873435] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:27.316 [2024-04-25 20:55:42.873450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.316 #79 NEW cov: 11968 ft: 15241 corp: 31/1065b lim: 50 exec/s: 79 rss: 70Mb L: 40/49 MS: 1 CMP- DE: "\001\000\000\000"- 00:08:27.316 [2024-04-25 20:55:42.913214] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 
00:08:27.316 [2024-04-25 20:55:42.913241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.316 [2024-04-25 20:55:42.913301] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.316 [2024-04-25 20:55:42.913316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.316 [2024-04-25 20:55:42.913370] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.316 [2024-04-25 20:55:42.913386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.316 #80 NEW cov: 11968 ft: 15248 corp: 32/1103b lim: 50 exec/s: 80 rss: 70Mb L: 38/49 MS: 1 ChangeBinInt- 00:08:27.316 [2024-04-25 20:55:42.953311] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.316 [2024-04-25 20:55:42.953337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.316 [2024-04-25 20:55:42.953374] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.316 [2024-04-25 20:55:42.953389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.316 [2024-04-25 20:55:42.953445] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.316 [2024-04-25 20:55:42.953460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.576 #81 NEW cov: 11968 ft: 15271 corp: 33/1141b lim: 50 exec/s: 81 rss: 70Mb L: 38/49 MS: 1 PersAutoDict- DE: "\001\000\000\000"- 00:08:27.576 [2024-04-25 20:55:42.993171] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.576 [2024-04-25 20:55:42.993198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.576 #82 NEW cov: 11968 ft: 15285 corp: 34/1159b lim: 50 exec/s: 82 rss: 70Mb L: 18/49 MS: 1 CopyPart- 00:08:27.576 [2024-04-25 20:55:43.033867] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.576 [2024-04-25 20:55:43.033895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.576 [2024-04-25 20:55:43.033946] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.576 [2024-04-25 20:55:43.033961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.576 [2024-04-25 20:55:43.034019] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.576 [2024-04-25 20:55:43.034033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.576 [2024-04-25 20:55:43.034086] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) 
sqid:1 cid:3 nsid:0 00:08:27.576 [2024-04-25 20:55:43.034102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.576 [2024-04-25 20:55:43.034157] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:27.576 [2024-04-25 20:55:43.034172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:27.576 #83 NEW cov: 11968 ft: 15337 corp: 35/1209b lim: 50 exec/s: 83 rss: 70Mb L: 50/50 MS: 1 PersAutoDict- DE: "\001\000\000\000"- 00:08:27.576 [2024-04-25 20:55:43.073644] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.576 [2024-04-25 20:55:43.073672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.576 [2024-04-25 20:55:43.073714] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.576 [2024-04-25 20:55:43.073729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.576 [2024-04-25 20:55:43.073805] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.576 [2024-04-25 20:55:43.073822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.576 #84 NEW cov: 11968 ft: 15346 corp: 36/1247b lim: 50 exec/s: 84 rss: 70Mb L: 38/50 MS: 1 PersAutoDict- DE: "\377\003"- 00:08:27.576 [2024-04-25 20:55:43.113801] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.576 [2024-04-25 20:55:43.113828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.576 [2024-04-25 20:55:43.113869] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.576 [2024-04-25 20:55:43.113885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.576 [2024-04-25 20:55:43.113940] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.576 [2024-04-25 20:55:43.113957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.576 #85 NEW cov: 11968 ft: 15365 corp: 37/1285b lim: 50 exec/s: 85 rss: 70Mb L: 38/50 MS: 1 CopyPart- 00:08:27.576 [2024-04-25 20:55:43.154219] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.576 [2024-04-25 20:55:43.154246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.576 [2024-04-25 20:55:43.154291] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.576 [2024-04-25 20:55:43.154313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.576 [2024-04-25 20:55:43.154366] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.576 [2024-04-25 20:55:43.154379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.576 [2024-04-25 20:55:43.154432] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:27.576 [2024-04-25 20:55:43.154448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.576 [2024-04-25 20:55:43.154503] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:27.576 [2024-04-25 20:55:43.154519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:27.576 #86 NEW cov: 11968 ft: 15451 corp: 38/1335b lim: 50 exec/s: 86 rss: 70Mb L: 50/50 MS: 1 CMP- DE: "\001\000\000\037"- 00:08:27.576 [2024-04-25 20:55:43.193862] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.576 [2024-04-25 20:55:43.193888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.576 [2024-04-25 20:55:43.193955] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.576 [2024-04-25 20:55:43.193970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.577 #87 NEW cov: 11968 ft: 15464 corp: 39/1356b lim: 50 exec/s: 87 rss: 70Mb L: 21/50 MS: 1 ChangeBit- 00:08:27.577 [2024-04-25 20:55:43.233849] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.577 [2024-04-25 20:55:43.233876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.836 #88 NEW cov: 11968 ft: 15553 corp: 40/1372b lim: 50 exec/s: 88 rss: 70Mb L: 16/50 MS: 1 EraseBytes- 00:08:27.836 [2024-04-25 20:55:43.284291] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.836 [2024-04-25 20:55:43.284318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.836 [2024-04-25 20:55:43.284356] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.836 [2024-04-25 20:55:43.284371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.836 [2024-04-25 20:55:43.284426] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.836 [2024-04-25 20:55:43.284442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.836 #89 NEW cov: 11968 ft: 15575 corp: 41/1410b lim: 50 exec/s: 89 rss: 71Mb L: 38/50 MS: 1 ChangeBit- 00:08:27.836 [2024-04-25 20:55:43.324374] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.836 [2024-04-25 20:55:43.324401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.836 [2024-04-25 20:55:43.324436] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.836 [2024-04-25 20:55:43.324451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.836 [2024-04-25 20:55:43.324526] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.836 [2024-04-25 20:55:43.324542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.836 #90 NEW cov: 11968 ft: 15596 corp: 42/1448b lim: 50 exec/s: 90 rss: 71Mb L: 38/50 MS: 1 ChangeBit- 00:08:27.836 [2024-04-25 20:55:43.364528] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.836 [2024-04-25 20:55:43.364555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.836 [2024-04-25 20:55:43.364590] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.836 [2024-04-25 20:55:43.364604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.836 [2024-04-25 20:55:43.364664] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.836 [2024-04-25 20:55:43.364680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.836 #91 NEW cov: 11968 ft: 15659 corp: 43/1486b lim: 50 exec/s: 45 rss: 71Mb L: 38/50 MS: 1 ChangeBit- 00:08:27.836 #91 DONE cov: 11968 ft: 15659 corp: 43/1486b lim: 50 exec/s: 45 rss: 71Mb 00:08:27.836 ###### Recommended dictionary. ###### 00:08:27.836 "\377\003" # Uses: 5 00:08:27.836 "\001\000\000\000" # Uses: 2 00:08:27.836 "\001\000\000\037" # Uses: 0 00:08:27.836 ###### End of recommended dictionary. 
######
00:08:27.836 Done 91 runs in 2 second(s)
00:08:27.836 20:55:43 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_21.conf /var/tmp/suppress_nvmf_fuzz
00:08:27.836 20:55:43 -- ../common.sh@72 -- # (( i++ ))
00:08:27.836 20:55:43 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:27.836 20:55:43 -- ../common.sh@73 -- # start_llvm_fuzz 22 1 0x1
00:08:27.836 20:55:43 -- nvmf/run.sh@23 -- # local fuzzer_type=22
00:08:27.836 20:55:43 -- nvmf/run.sh@24 -- # local timen=1
00:08:27.836 20:55:43 -- nvmf/run.sh@25 -- # local core=0x1
00:08:27.836 20:55:43 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22
00:08:27.836 20:55:43 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_22.conf
00:08:27.836 20:55:43 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz
00:08:27.836 20:55:43 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0
00:08:27.836 20:55:43 -- nvmf/run.sh@34 -- # printf %02d 22
00:08:27.836 20:55:43 -- nvmf/run.sh@34 -- # port=4422
00:08:27.836 20:55:43 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22
00:08:28.096 20:55:43 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422'
00:08:28.096 20:55:43 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4422"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:28.096 20:55:43 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect
00:08:28.096 20:55:43 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create
00:08:28.096 20:55:43 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' -c /tmp/fuzz_json_22.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 -Z 22
00:08:28.096 [2024-04-25 20:55:43.516477] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization...
00:08:28.096 [2024-04-25 20:55:43.516533] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid201921 ]
00:08:28.096 EAL: No free 2048 kB hugepages reported on node 1
00:08:28.096 [2024-04-25 20:55:43.656382] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation.
00:08:28.096 [2024-04-25 20:55:43.693371] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:28.096 [2024-04-25 20:55:43.712557] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:28.355 [2024-04-25 20:55:43.764728] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:28.355 [2024-04-25 20:55:43.781065] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4422 ***
00:08:28.355 INFO: Running with entropic power schedule (0xFF, 100).
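
The xtrace above records test/fuzz/llvm/nvmf/run.sh tearing down fuzzer 21 and preparing fuzzer 22: it derives a per-fuzzer TCP port, creates a corpus directory, points the shared NVMe-oF JSON config at that port, writes LeakSanitizer suppressions, and launches llvm_nvme_fuzz for one second on core mask 0x1. A condensed sketch of that sequence follows; $SPDK standing for the checkout root, the "44" port prefix, and the output redirections of sed and echo are inferred from the trace rather than visible in it.

start_llvm_fuzz() {    # invoked above as: start_llvm_fuzz 22 1 0x1
    local fuzzer_type=$1 timen=$2 core=$3
    local corpus_dir="$SPDK/../corpus/llvm_nvmf_${fuzzer_type}"
    local nvmf_cfg="/tmp/fuzz_json_${fuzzer_type}.conf"
    local suppress_file=/var/tmp/suppress_nvmf_fuzz

    # Per-fuzzer listener port: "44" + zero-padded fuzzer number (22 -> 4422).
    local port="44$(printf %02d "$fuzzer_type")"
    mkdir -p "$corpus_dir"
    local trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"

    # Rewrite the template config to listen on this fuzzer's port
    # (the redirection into $nvmf_cfg is assumed, not shown in the trace).
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        "$SPDK/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"

    # Intentional in-target leaks that LeakSanitizer should ignore.
    echo leak:spdk_nvmf_qpair_disconnect > "$suppress_file"
    echo leak:nvmf_ctrlr_create >> "$suppress_file"

    # The trace declares LSAN_OPTIONS with `local`; passing it inline here
    # keeps the sketch self-contained.
    LSAN_OPTIONS="report_objects=1:suppressions=$suppress_file:print_suppressions=0" \
        "$SPDK/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m "$core" -s 512 \
        -P "$SPDK/../output/llvm/" -F "$trid" -c "$nvmf_cfg" -t "$timen" \
        -D "$corpus_dir" -Z "$fuzzer_type"
}
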
00:08:28.355 INFO: Seed: 1051304869
00:08:28.355 INFO: Loaded 1 modules (348579 inline 8-bit counters): 348579 [0x2764bcc, 0x27b9d6f),
00:08:28.355 INFO: Loaded 1 PC tables (348579 PCs): 348579 [0x27b9d70,0x2d0b7a0),
00:08:28.355 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22
00:08:28.355 INFO: A corpus is not provided, starting from an empty corpus
00:08:28.355 #2 INITED exec/s: 0 rss: 60Mb
00:08:28.355 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? This may also happen if the target rejected all inputs we tried so far
00:08:28.355 [2024-04-25 20:55:43.836367] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0
00:08:28.355 [2024-04-25 20:55:43.836396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:28.355 [2024-04-25 20:55:43.836443] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0
00:08:28.355 [2024-04-25 20:55:43.836458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:28.355 [2024-04-25 20:55:43.836512] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0
00:08:28.355 [2024-04-25 20:55:43.836527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.615
cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.615 [2024-04-25 20:55:44.207319] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:28.615 [2024-04-25 20:55:44.207333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.615 [2024-04-25 20:55:44.207389] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:28.615 [2024-04-25 20:55:44.207405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.615 #15 NEW cov: 11886 ft: 12586 corp: 4/190b lim: 85 exec/s: 0 rss: 68Mb L: 63/63 MS: 1 ChangeBit- 00:08:28.615 [2024-04-25 20:55:44.247507] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.615 [2024-04-25 20:55:44.247534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.615 [2024-04-25 20:55:44.247581] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:28.615 [2024-04-25 20:55:44.247595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.615 [2024-04-25 20:55:44.247647] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:28.615 [2024-04-25 20:55:44.247662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.615 [2024-04-25 20:55:44.247715] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:28.615 [2024-04-25 20:55:44.247731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.615 #16 NEW cov: 11971 ft: 13147 corp: 5/274b lim: 85 exec/s: 0 rss: 68Mb L: 84/84 MS: 1 InsertRepeatedBytes- 00:08:28.875 [2024-04-25 20:55:44.297670] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.875 [2024-04-25 20:55:44.297697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.875 [2024-04-25 20:55:44.297743] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:28.875 [2024-04-25 20:55:44.297759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.875 [2024-04-25 20:55:44.297811] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:28.875 [2024-04-25 20:55:44.297827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.875 [2024-04-25 20:55:44.297881] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:28.875 [2024-04-25 20:55:44.297897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.875 #17 NEW cov: 11971 ft: 13279 corp: 6/358b lim: 85 
exec/s: 0 rss: 69Mb L: 84/84 MS: 1 CMP- DE: "\377\377\377\377"- 00:08:28.875 [2024-04-25 20:55:44.337555] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.875 [2024-04-25 20:55:44.337581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.875 [2024-04-25 20:55:44.337620] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:28.875 [2024-04-25 20:55:44.337635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.875 [2024-04-25 20:55:44.337689] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:28.875 [2024-04-25 20:55:44.337705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.875 #18 NEW cov: 11971 ft: 13333 corp: 7/425b lim: 85 exec/s: 0 rss: 69Mb L: 67/84 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:08:28.875 [2024-04-25 20:55:44.377709] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.875 [2024-04-25 20:55:44.377735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.875 [2024-04-25 20:55:44.377779] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:28.875 [2024-04-25 20:55:44.377797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.875 [2024-04-25 20:55:44.377852] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:28.875 [2024-04-25 20:55:44.377868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.875 #19 NEW cov: 11971 ft: 13371 corp: 8/492b lim: 85 exec/s: 0 rss: 69Mb L: 67/84 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:08:28.875 [2024-04-25 20:55:44.417932] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.875 [2024-04-25 20:55:44.417958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.875 [2024-04-25 20:55:44.418010] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:28.875 [2024-04-25 20:55:44.418026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.875 [2024-04-25 20:55:44.418095] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:28.875 [2024-04-25 20:55:44.418111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.875 [2024-04-25 20:55:44.418165] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:28.875 [2024-04-25 20:55:44.418181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 
dnr:1 00:08:28.875 #20 NEW cov: 11971 ft: 13414 corp: 9/576b lim: 85 exec/s: 0 rss: 69Mb L: 84/84 MS: 1 ShuffleBytes- 00:08:28.875 [2024-04-25 20:55:44.458084] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.875 [2024-04-25 20:55:44.458111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.875 [2024-04-25 20:55:44.458157] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:28.875 [2024-04-25 20:55:44.458173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.875 [2024-04-25 20:55:44.458228] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:28.875 [2024-04-25 20:55:44.458243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.875 [2024-04-25 20:55:44.458297] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:28.875 [2024-04-25 20:55:44.458312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.875 #21 NEW cov: 11971 ft: 13460 corp: 10/647b lim: 85 exec/s: 0 rss: 69Mb L: 71/84 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"- 00:08:28.875 [2024-04-25 20:55:44.498354] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.875 [2024-04-25 20:55:44.498381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.875 [2024-04-25 20:55:44.498427] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:28.875 [2024-04-25 20:55:44.498442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.876 [2024-04-25 20:55:44.498496] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:28.876 [2024-04-25 20:55:44.498512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.876 [2024-04-25 20:55:44.498568] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:28.876 [2024-04-25 20:55:44.498582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.876 [2024-04-25 20:55:44.498636] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:08:28.876 [2024-04-25 20:55:44.498651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:28.876 #22 NEW cov: 11971 ft: 13595 corp: 11/732b lim: 85 exec/s: 0 rss: 69Mb L: 85/85 MS: 1 InsertByte- 00:08:29.136 [2024-04-25 20:55:44.538374] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.136 [2024-04-25 20:55:44.538401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.136 [2024-04-25 20:55:44.538448] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.136 [2024-04-25 20:55:44.538463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.136 [2024-04-25 20:55:44.538517] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.136 [2024-04-25 20:55:44.538532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.136 [2024-04-25 20:55:44.538587] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:29.136 [2024-04-25 20:55:44.538602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.136 #23 NEW cov: 11971 ft: 13625 corp: 12/800b lim: 85 exec/s: 0 rss: 69Mb L: 68/85 MS: 1 InsertByte- 00:08:29.136 [2024-04-25 20:55:44.578574] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.136 [2024-04-25 20:55:44.578602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.136 [2024-04-25 20:55:44.578649] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.136 [2024-04-25 20:55:44.578665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.136 [2024-04-25 20:55:44.578719] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.136 [2024-04-25 20:55:44.578734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.136 [2024-04-25 20:55:44.578788] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:29.136 [2024-04-25 20:55:44.578804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.136 [2024-04-25 20:55:44.578860] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:08:29.136 [2024-04-25 20:55:44.578875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:29.136 #24 NEW cov: 11971 ft: 13640 corp: 13/885b lim: 85 exec/s: 0 rss: 69Mb L: 85/85 MS: 1 CopyPart- 00:08:29.136 [2024-04-25 20:55:44.618705] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.136 [2024-04-25 20:55:44.618732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.136 [2024-04-25 20:55:44.618782] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.136 [2024-04-25 20:55:44.618799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.136 [2024-04-25 20:55:44.618855] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.136 [2024-04-25 20:55:44.618870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.136 [2024-04-25 20:55:44.618925] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:29.136 [2024-04-25 20:55:44.618940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.136 [2024-04-25 20:55:44.618997] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:08:29.136 [2024-04-25 20:55:44.619013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:29.136 #25 NEW cov: 11971 ft: 13675 corp: 14/970b lim: 85 exec/s: 0 rss: 69Mb L: 85/85 MS: 1 InsertByte- 00:08:29.136 [2024-04-25 20:55:44.658824] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.136 [2024-04-25 20:55:44.658851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.136 [2024-04-25 20:55:44.658903] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.136 [2024-04-25 20:55:44.658919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.136 [2024-04-25 20:55:44.658971] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.136 [2024-04-25 20:55:44.658987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.136 [2024-04-25 20:55:44.659045] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:29.136 [2024-04-25 20:55:44.659061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.136 [2024-04-25 20:55:44.659115] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:08:29.136 [2024-04-25 20:55:44.659131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:29.136 #26 NEW cov: 11971 ft: 13696 corp: 15/1055b lim: 85 exec/s: 0 rss: 69Mb L: 85/85 MS: 1 ChangeBit- 00:08:29.136 [2024-04-25 20:55:44.698612] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.136 [2024-04-25 20:55:44.698638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.136 [2024-04-25 20:55:44.698681] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.136 [2024-04-25 20:55:44.698697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.136 [2024-04-25 20:55:44.698752] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.136 
[2024-04-25 20:55:44.698767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.136 NEW_FUNC[1/1]: 0x19e6b30 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:29.136 #27 NEW cov: 11994 ft: 13721 corp: 16/1118b lim: 85 exec/s: 0 rss: 69Mb L: 63/85 MS: 1 ChangeBinInt- 00:08:29.136 [2024-04-25 20:55:44.738851] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.136 [2024-04-25 20:55:44.738877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.136 [2024-04-25 20:55:44.738923] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.136 [2024-04-25 20:55:44.738941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.136 [2024-04-25 20:55:44.738996] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.136 [2024-04-25 20:55:44.739012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.136 [2024-04-25 20:55:44.739068] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:29.136 [2024-04-25 20:55:44.739084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.136 #28 NEW cov: 11994 ft: 13759 corp: 17/1202b lim: 85 exec/s: 0 rss: 69Mb L: 84/85 MS: 1 ChangeBinInt- 00:08:29.136 [2024-04-25 20:55:44.778982] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.136 [2024-04-25 20:55:44.779016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.136 [2024-04-25 20:55:44.779062] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.136 [2024-04-25 20:55:44.779077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.136 [2024-04-25 20:55:44.779130] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.136 [2024-04-25 20:55:44.779145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.136 [2024-04-25 20:55:44.779197] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:29.136 [2024-04-25 20:55:44.779213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.396 #29 NEW cov: 11994 ft: 13770 corp: 18/1275b lim: 85 exec/s: 0 rss: 69Mb L: 73/85 MS: 1 CrossOver- 00:08:29.396 [2024-04-25 20:55:44.818950] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.396 [2024-04-25 20:55:44.818977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.396 [2024-04-25 
20:55:44.819019] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.396 [2024-04-25 20:55:44.819034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.396 [2024-04-25 20:55:44.819090] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.396 [2024-04-25 20:55:44.819106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.396 #30 NEW cov: 11994 ft: 13779 corp: 19/1327b lim: 85 exec/s: 30 rss: 70Mb L: 52/85 MS: 1 CrossOver- 00:08:29.396 [2024-04-25 20:55:44.859218] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.396 [2024-04-25 20:55:44.859244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.396 [2024-04-25 20:55:44.859290] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.396 [2024-04-25 20:55:44.859305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.396 [2024-04-25 20:55:44.859359] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.396 [2024-04-25 20:55:44.859375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.396 [2024-04-25 20:55:44.859429] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:29.396 [2024-04-25 20:55:44.859445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.396 #31 NEW cov: 11994 ft: 13792 corp: 20/1411b lim: 85 exec/s: 31 rss: 70Mb L: 84/85 MS: 1 ChangeBinInt- 00:08:29.396 [2024-04-25 20:55:44.899322] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.396 [2024-04-25 20:55:44.899349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.396 [2024-04-25 20:55:44.899398] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.396 [2024-04-25 20:55:44.899413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.396 [2024-04-25 20:55:44.899466] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.396 [2024-04-25 20:55:44.899481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.396 [2024-04-25 20:55:44.899536] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:29.396 [2024-04-25 20:55:44.899552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.396 #32 NEW cov: 11994 ft: 13816 corp: 21/1495b lim: 85 exec/s: 32 rss: 70Mb L: 84/85 MS: 1 CopyPart- 00:08:29.396 
[2024-04-25 20:55:44.939160] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.396 [2024-04-25 20:55:44.939188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.396 [2024-04-25 20:55:44.939242] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.396 [2024-04-25 20:55:44.939257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.396 #33 NEW cov: 11994 ft: 14173 corp: 22/1534b lim: 85 exec/s: 33 rss: 70Mb L: 39/85 MS: 1 EraseBytes- 00:08:29.396 [2024-04-25 20:55:44.979586] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.396 [2024-04-25 20:55:44.979614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.396 [2024-04-25 20:55:44.979662] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.396 [2024-04-25 20:55:44.979678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.396 [2024-04-25 20:55:44.979731] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.396 [2024-04-25 20:55:44.979747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.396 [2024-04-25 20:55:44.979803] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:29.396 [2024-04-25 20:55:44.979819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.396 #34 NEW cov: 11994 ft: 14203 corp: 23/1602b lim: 85 exec/s: 34 rss: 70Mb L: 68/85 MS: 1 InsertByte- 00:08:29.396 [2024-04-25 20:55:45.019710] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.396 [2024-04-25 20:55:45.019736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.396 [2024-04-25 20:55:45.019776] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.396 [2024-04-25 20:55:45.019794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.396 [2024-04-25 20:55:45.019847] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.396 [2024-04-25 20:55:45.019864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.396 [2024-04-25 20:55:45.019922] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:29.396 [2024-04-25 20:55:45.019939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.396 #35 NEW cov: 11994 ft: 14228 corp: 24/1676b lim: 85 exec/s: 35 rss: 70Mb L: 74/85 MS: 1 InsertByte- 
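
For anyone skimming the stream above: each "#N NEW cov: ... MS: ..." record, like the one immediately above, is standard libFuzzer status output, reading roughly as N inputs executed so far, cov covered code points, ft distinct coverage features, corp corpus entries/total bytes, lim the current input-length cap, exec/s execution rate, rss resident memory, L the new input's length over the longest corpus entry, and MS the mutation sequence that produced it; PersAutoDict and CMP mutations quote the dictionary entry they used after DE:. Entries like the "Recommended dictionary" block printed at the end of run 21 can be fed back to a stock libFuzzer target through a -dict= file; a hypothetical sketch follows, with the octal-escaped bytes rewritten in the \xNN hex form the dictionary parser accepts (whether this SPDK wrapper forwards a -dict= flag is not shown in this log).

# Hypothetical dictionary file built from the run-21 recommendations;
# the entry names are arbitrary labels.
cat > /tmp/llvm_nvmf.dict <<'EOF'
kw1="\xff\x03"
kw2="\x01\x00\x00\x00"
kw3="\x01\x00\x00\x1f"
EOF
# A plain libFuzzer binary would consume it as: ./fuzzer -dict=/tmp/llvm_nvmf.dict corpus_dir/
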
00:08:29.656 [2024-04-25 20:55:45.059812] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.656 [2024-04-25 20:55:45.059840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.656 [2024-04-25 20:55:45.059887] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.656 [2024-04-25 20:55:45.059903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.656 [2024-04-25 20:55:45.059953] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.656 [2024-04-25 20:55:45.059968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.656 [2024-04-25 20:55:45.060040] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:29.656 [2024-04-25 20:55:45.060056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.656 #36 NEW cov: 11994 ft: 14241 corp: 25/1750b lim: 85 exec/s: 36 rss: 70Mb L: 74/85 MS: 1 CopyPart- 00:08:29.657 [2024-04-25 20:55:45.100067] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.657 [2024-04-25 20:55:45.100094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.657 [2024-04-25 20:55:45.100147] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.657 [2024-04-25 20:55:45.100162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.657 [2024-04-25 20:55:45.100215] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.657 [2024-04-25 20:55:45.100231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.657 [2024-04-25 20:55:45.100285] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:29.657 [2024-04-25 20:55:45.100301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.657 [2024-04-25 20:55:45.100360] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:08:29.657 [2024-04-25 20:55:45.100376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:29.657 #37 NEW cov: 11994 ft: 14255 corp: 26/1835b lim: 85 exec/s: 37 rss: 70Mb L: 85/85 MS: 1 CopyPart- 00:08:29.657 [2024-04-25 20:55:45.149915] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.657 [2024-04-25 20:55:45.149942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.657 [2024-04-25 20:55:45.149978] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION 
REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.657 [2024-04-25 20:55:45.150001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.657 [2024-04-25 20:55:45.150056] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.657 [2024-04-25 20:55:45.150071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.657 #38 NEW cov: 11994 ft: 14268 corp: 27/1898b lim: 85 exec/s: 38 rss: 70Mb L: 63/85 MS: 1 ChangeBinInt- 00:08:29.657 [2024-04-25 20:55:45.190352] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.657 [2024-04-25 20:55:45.190380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.657 [2024-04-25 20:55:45.190434] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.657 [2024-04-25 20:55:45.190449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.657 [2024-04-25 20:55:45.190500] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.657 [2024-04-25 20:55:45.190515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.657 [2024-04-25 20:55:45.190567] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:29.657 [2024-04-25 20:55:45.190582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.657 [2024-04-25 20:55:45.190634] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:08:29.657 [2024-04-25 20:55:45.190650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:29.657 #39 NEW cov: 11994 ft: 14349 corp: 28/1983b lim: 85 exec/s: 39 rss: 70Mb L: 85/85 MS: 1 CopyPart- 00:08:29.657 [2024-04-25 20:55:45.230289] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.657 [2024-04-25 20:55:45.230316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.657 [2024-04-25 20:55:45.230361] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.657 [2024-04-25 20:55:45.230377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.657 [2024-04-25 20:55:45.230431] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.657 [2024-04-25 20:55:45.230446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.657 [2024-04-25 20:55:45.230516] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:29.657 [2024-04-25 20:55:45.230532] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.657 #40 NEW cov: 11994 ft: 14355 corp: 29/2051b lim: 85 exec/s: 40 rss: 70Mb L: 68/85 MS: 1 InsertByte- 00:08:29.657 [2024-04-25 20:55:45.270242] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.657 [2024-04-25 20:55:45.270268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.657 [2024-04-25 20:55:45.270304] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.657 [2024-04-25 20:55:45.270319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.657 [2024-04-25 20:55:45.270375] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.657 [2024-04-25 20:55:45.270390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.657 #41 NEW cov: 11994 ft: 14419 corp: 30/2114b lim: 85 exec/s: 41 rss: 70Mb L: 63/85 MS: 1 ChangeByte- 00:08:29.657 [2024-04-25 20:55:45.310559] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.657 [2024-04-25 20:55:45.310585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.657 [2024-04-25 20:55:45.310623] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.657 [2024-04-25 20:55:45.310639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.657 [2024-04-25 20:55:45.310691] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.657 [2024-04-25 20:55:45.310706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.657 [2024-04-25 20:55:45.310758] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:29.657 [2024-04-25 20:55:45.310774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.917 #42 NEW cov: 11994 ft: 14438 corp: 31/2182b lim: 85 exec/s: 42 rss: 70Mb L: 68/85 MS: 1 EraseBytes- 00:08:29.917 [2024-04-25 20:55:45.350761] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.917 [2024-04-25 20:55:45.350788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.917 [2024-04-25 20:55:45.350838] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.917 [2024-04-25 20:55:45.350853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.917 [2024-04-25 20:55:45.350907] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.917 [2024-04-25 20:55:45.350923] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.917 [2024-04-25 20:55:45.350977] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:29.917 [2024-04-25 20:55:45.350998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.917 [2024-04-25 20:55:45.351052] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:08:29.917 [2024-04-25 20:55:45.351068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:29.917 #43 NEW cov: 11994 ft: 14481 corp: 32/2267b lim: 85 exec/s: 43 rss: 70Mb L: 85/85 MS: 1 ShuffleBytes- 00:08:29.917 [2024-04-25 20:55:45.390708] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.917 [2024-04-25 20:55:45.390735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.917 [2024-04-25 20:55:45.390782] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.917 [2024-04-25 20:55:45.390797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.917 [2024-04-25 20:55:45.390849] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.917 [2024-04-25 20:55:45.390865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.917 [2024-04-25 20:55:45.390922] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:29.917 [2024-04-25 20:55:45.390937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.917 #44 NEW cov: 11994 ft: 14488 corp: 33/2351b lim: 85 exec/s: 44 rss: 70Mb L: 84/85 MS: 1 ShuffleBytes- 00:08:29.917 [2024-04-25 20:55:45.430852] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.917 [2024-04-25 20:55:45.430878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.917 [2024-04-25 20:55:45.430929] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.917 [2024-04-25 20:55:45.430943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.917 [2024-04-25 20:55:45.430998] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.917 [2024-04-25 20:55:45.431013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.917 [2024-04-25 20:55:45.431070] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:29.917 [2024-04-25 20:55:45.431085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 
cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.917 #45 NEW cov: 11994 ft: 14490 corp: 34/2434b lim: 85 exec/s: 45 rss: 70Mb L: 83/85 MS: 1 CopyPart- 00:08:29.917 [2024-04-25 20:55:45.470951] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.917 [2024-04-25 20:55:45.470978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.917 [2024-04-25 20:55:45.471030] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.917 [2024-04-25 20:55:45.471046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.917 [2024-04-25 20:55:45.471100] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.917 [2024-04-25 20:55:45.471115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.917 [2024-04-25 20:55:45.471170] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:29.917 [2024-04-25 20:55:45.471185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.917 #46 NEW cov: 11994 ft: 14500 corp: 35/2508b lim: 85 exec/s: 46 rss: 70Mb L: 74/85 MS: 1 ChangeBit- 00:08:29.917 [2024-04-25 20:55:45.511070] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.917 [2024-04-25 20:55:45.511096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.917 [2024-04-25 20:55:45.511142] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.917 [2024-04-25 20:55:45.511157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.917 [2024-04-25 20:55:45.511211] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.917 [2024-04-25 20:55:45.511226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.917 [2024-04-25 20:55:45.511298] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:29.917 [2024-04-25 20:55:45.511314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.917 #47 NEW cov: 11994 ft: 14514 corp: 36/2592b lim: 85 exec/s: 47 rss: 70Mb L: 84/85 MS: 1 ShuffleBytes- 00:08:29.917 [2024-04-25 20:55:45.551190] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.917 [2024-04-25 20:55:45.551217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.917 [2024-04-25 20:55:45.551265] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.917 [2024-04-25 20:55:45.551279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.917 [2024-04-25 20:55:45.551334] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.917 [2024-04-25 20:55:45.551348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.917 [2024-04-25 20:55:45.551403] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:29.917 [2024-04-25 20:55:45.551418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.917 #48 NEW cov: 11994 ft: 14531 corp: 37/2666b lim: 85 exec/s: 48 rss: 70Mb L: 74/85 MS: 1 ChangeBit- 00:08:30.176 [2024-04-25 20:55:45.591295] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:30.176 [2024-04-25 20:55:45.591322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.176 [2024-04-25 20:55:45.591368] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:30.176 [2024-04-25 20:55:45.591384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.176 [2024-04-25 20:55:45.591440] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:30.176 [2024-04-25 20:55:45.591455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.176 [2024-04-25 20:55:45.591509] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:30.176 [2024-04-25 20:55:45.591526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.176 #49 NEW cov: 11994 ft: 14562 corp: 38/2740b lim: 85 exec/s: 49 rss: 70Mb L: 74/85 MS: 1 ShuffleBytes- 00:08:30.176 [2024-04-25 20:55:45.631579] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:30.176 [2024-04-25 20:55:45.631606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.176 [2024-04-25 20:55:45.631660] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:30.176 [2024-04-25 20:55:45.631676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.176 [2024-04-25 20:55:45.631730] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:30.176 [2024-04-25 20:55:45.631746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.176 [2024-04-25 20:55:45.631799] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:30.176 [2024-04-25 20:55:45.631814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.176 [2024-04-25 20:55:45.631872] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:08:30.176 [2024-04-25 20:55:45.631891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:30.176 #50 NEW cov: 11994 ft: 14571 corp: 39/2825b lim: 85 exec/s: 50 rss: 70Mb L: 85/85 MS: 1 CrossOver- 00:08:30.176 [2024-04-25 20:55:45.671680] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:30.176 [2024-04-25 20:55:45.671707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.176 [2024-04-25 20:55:45.671761] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:30.176 [2024-04-25 20:55:45.671777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.176 [2024-04-25 20:55:45.671832] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:30.176 [2024-04-25 20:55:45.671848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.176 [2024-04-25 20:55:45.671899] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:30.176 [2024-04-25 20:55:45.671914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.176 [2024-04-25 20:55:45.671968] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:08:30.176 [2024-04-25 20:55:45.671983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:30.176 #51 NEW cov: 11994 ft: 14579 corp: 40/2910b lim: 85 exec/s: 51 rss: 70Mb L: 85/85 MS: 1 ChangeBinInt- 00:08:30.176 [2024-04-25 20:55:45.711647] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:30.176 [2024-04-25 20:55:45.711673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.176 [2024-04-25 20:55:45.711722] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:30.176 [2024-04-25 20:55:45.711738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.176 [2024-04-25 20:55:45.711791] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:30.176 [2024-04-25 20:55:45.711806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.176 [2024-04-25 20:55:45.711862] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:30.176 [2024-04-25 20:55:45.711877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.176 #52 NEW cov: 11994 ft: 14609 corp: 41/2978b lim: 85 exec/s: 52 rss: 70Mb L: 68/85 MS: 1 ChangeBit- 00:08:30.176 [2024-04-25 20:55:45.751804] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:30.176 [2024-04-25 20:55:45.751831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.176 [2024-04-25 20:55:45.751877] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:30.177 [2024-04-25 20:55:45.751892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.177 [2024-04-25 20:55:45.751946] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:30.177 [2024-04-25 20:55:45.751962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.177 [2024-04-25 20:55:45.752018] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:30.177 [2024-04-25 20:55:45.752036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.177 #53 NEW cov: 11994 ft: 14610 corp: 42/3062b lim: 85 exec/s: 53 rss: 70Mb L: 84/85 MS: 1 ChangeBinInt- 00:08:30.177 [2024-04-25 20:55:45.791934] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:30.177 [2024-04-25 20:55:45.791961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.177 [2024-04-25 20:55:45.792007] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:30.177 [2024-04-25 20:55:45.792024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.177 [2024-04-25 20:55:45.792093] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:30.177 [2024-04-25 20:55:45.792108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.177 [2024-04-25 20:55:45.792162] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:30.177 [2024-04-25 20:55:45.792178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.177 #54 NEW cov: 11994 ft: 14626 corp: 43/3135b lim: 85 exec/s: 54 rss: 70Mb L: 73/85 MS: 1 ChangeBinInt- 00:08:30.177 [2024-04-25 20:55:45.832187] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:30.177 [2024-04-25 20:55:45.832215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.177 [2024-04-25 20:55:45.832266] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:30.177 [2024-04-25 20:55:45.832282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.177 [2024-04-25 20:55:45.832334] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 
00:08:30.177 [2024-04-25 20:55:45.832350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.177 [2024-04-25 20:55:45.832404] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:30.177 [2024-04-25 20:55:45.832419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.177 [2024-04-25 20:55:45.832474] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:08:30.177 [2024-04-25 20:55:45.832489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:30.435 #55 NEW cov: 11994 ft: 14663 corp: 44/3220b lim: 85 exec/s: 27 rss: 71Mb L: 85/85 MS: 1 InsertByte- 00:08:30.435 #55 DONE cov: 11994 ft: 14663 corp: 44/3220b lim: 85 exec/s: 27 rss: 71Mb 00:08:30.435 ###### Recommended dictionary. ###### 00:08:30.435 "\377\377\377\377" # Uses: 2 00:08:30.435 "\000\000\000\000\000\000\000\000" # Uses: 0 00:08:30.435 ###### End of recommended dictionary. ###### 00:08:30.435 Done 55 runs in 2 second(s) 00:08:30.435 20:55:45 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_22.conf /var/tmp/suppress_nvmf_fuzz 00:08:30.435 20:55:45 -- ../common.sh@72 -- # (( i++ )) 00:08:30.435 20:55:45 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:30.435 20:55:45 -- ../common.sh@73 -- # start_llvm_fuzz 23 1 0x1 00:08:30.435 20:55:45 -- nvmf/run.sh@23 -- # local fuzzer_type=23 00:08:30.435 20:55:45 -- nvmf/run.sh@24 -- # local timen=1 00:08:30.435 20:55:45 -- nvmf/run.sh@25 -- # local core=0x1 00:08:30.435 20:55:45 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:30.435 20:55:45 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_23.conf 00:08:30.435 20:55:45 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:30.435 20:55:45 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:30.435 20:55:45 -- nvmf/run.sh@34 -- # printf %02d 23 00:08:30.435 20:55:45 -- nvmf/run.sh@34 -- # port=4423 00:08:30.435 20:55:45 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:30.435 20:55:45 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' 00:08:30.436 20:55:45 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4423"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:30.436 20:55:45 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:30.436 20:55:45 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:30.436 20:55:45 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' -c /tmp/fuzz_json_23.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 -Z 23 00:08:30.436 [2024-04-25 20:55:46.003620] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 
00:08:30.436 [2024-04-25 20:55:46.003698] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid202322 ] 00:08:30.436 EAL: No free 2048 kB hugepages reported on node 1 00:08:30.695 [2024-04-25 20:55:46.143091] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:08:30.695 [2024-04-25 20:55:46.181481] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:30.695 [2024-04-25 20:55:46.200770] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:30.695 [2024-04-25 20:55:46.252759] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:30.695 [2024-04-25 20:55:46.269252] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4423 *** 00:08:30.695 INFO: Running with entropic power schedule (0xFF, 100). 00:08:30.695 INFO: Seed: 3538303443 00:08:30.695 INFO: Loaded 1 modules (348579 inline 8-bit counters): 348579 [0x2764bcc, 0x27b9d6f), 00:08:30.695 INFO: Loaded 1 PC tables (348579 PCs): 348579 [0x27b9d70,0x2d0b7a0), 00:08:30.695 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:30.695 INFO: A corpus is not provided, starting from an empty corpus 00:08:30.695 #2 INITED exec/s: 0 rss: 61Mb 00:08:30.695 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:30.695 This may also happen if the target rejected all inputs we tried so far 00:08:30.695 [2024-04-25 20:55:46.314412] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:30.695 [2024-04-25 20:55:46.314442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.695 [2024-04-25 20:55:46.314493] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:30.695 [2024-04-25 20:55:46.314508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.695 [2024-04-25 20:55:46.314567] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:30.695 [2024-04-25 20:55:46.314583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.264 NEW_FUNC[1/671]: 0x4ce620 in fuzz_nvm_reservation_report_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:671 00:08:31.264 NEW_FUNC[2/671]: 0x4e0360 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:31.264 #5 NEW cov: 11683 ft: 11684 corp: 2/18b lim: 25 exec/s: 0 rss: 68Mb L: 17/17 MS: 3 ChangeBit-ChangeBit-InsertRepeatedBytes- 00:08:31.264 [2024-04-25 20:55:46.635233] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.264 [2024-04-25 20:55:46.635275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.264 [2024-04-25 20:55:46.635343] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION 
REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.264 [2024-04-25 20:55:46.635374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.264 [2024-04-25 20:55:46.635432] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:31.264 [2024-04-25 20:55:46.635447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.264 #6 NEW cov: 11813 ft: 12187 corp: 3/35b lim: 25 exec/s: 0 rss: 68Mb L: 17/17 MS: 1 CopyPart- 00:08:31.264 [2024-04-25 20:55:46.685158] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.264 [2024-04-25 20:55:46.685186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.264 [2024-04-25 20:55:46.685241] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.264 [2024-04-25 20:55:46.685257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.264 #7 NEW cov: 11819 ft: 12650 corp: 4/46b lim: 25 exec/s: 0 rss: 68Mb L: 11/17 MS: 1 EraseBytes- 00:08:31.264 [2024-04-25 20:55:46.725369] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.264 [2024-04-25 20:55:46.725396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.264 [2024-04-25 20:55:46.725436] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.264 [2024-04-25 20:55:46.725452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.264 [2024-04-25 20:55:46.725511] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:31.264 [2024-04-25 20:55:46.725527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.264 #8 NEW cov: 11904 ft: 12961 corp: 5/63b lim: 25 exec/s: 0 rss: 69Mb L: 17/17 MS: 1 ChangeByte- 00:08:31.264 [2024-04-25 20:55:46.765688] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.264 [2024-04-25 20:55:46.765714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.264 [2024-04-25 20:55:46.765770] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.264 [2024-04-25 20:55:46.765786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.264 [2024-04-25 20:55:46.765842] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:31.264 [2024-04-25 20:55:46.765858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.264 [2024-04-25 20:55:46.765914] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 
00:08:31.264 [2024-04-25 20:55:46.765931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.264 #9 NEW cov: 11904 ft: 13504 corp: 6/86b lim: 25 exec/s: 0 rss: 69Mb L: 23/23 MS: 1 CrossOver- 00:08:31.264 [2024-04-25 20:55:46.805730] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.264 [2024-04-25 20:55:46.805759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.264 [2024-04-25 20:55:46.805805] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.264 [2024-04-25 20:55:46.805821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.264 [2024-04-25 20:55:46.805881] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:31.264 [2024-04-25 20:55:46.805895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.264 [2024-04-25 20:55:46.805952] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:31.264 [2024-04-25 20:55:46.805968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.264 #14 NEW cov: 11904 ft: 13547 corp: 7/109b lim: 25 exec/s: 0 rss: 69Mb L: 23/23 MS: 5 ChangeByte-ChangeBit-ChangeBit-ShuffleBytes-InsertRepeatedBytes- 00:08:31.264 [2024-04-25 20:55:46.845749] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.264 [2024-04-25 20:55:46.845775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.264 [2024-04-25 20:55:46.845815] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.264 [2024-04-25 20:55:46.845831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.264 [2024-04-25 20:55:46.845890] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:31.264 [2024-04-25 20:55:46.845906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.264 #17 NEW cov: 11904 ft: 13663 corp: 8/126b lim: 25 exec/s: 0 rss: 69Mb L: 17/23 MS: 3 InsertByte-CopyPart-CrossOver- 00:08:31.264 [2024-04-25 20:55:46.885747] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.264 [2024-04-25 20:55:46.885773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.264 [2024-04-25 20:55:46.885815] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.264 [2024-04-25 20:55:46.885830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.264 #23 NEW cov: 11904 ft: 13738 corp: 9/137b lim: 25 exec/s: 0 rss: 69Mb L: 11/23 MS: 1 
ChangeBit- 00:08:31.264 [2024-04-25 20:55:46.926048] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.264 [2024-04-25 20:55:46.926075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.264 [2024-04-25 20:55:46.926126] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.264 [2024-04-25 20:55:46.926142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.524 [2024-04-25 20:55:46.926204] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:31.524 [2024-04-25 20:55:46.926220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.524 #24 NEW cov: 11904 ft: 13764 corp: 10/154b lim: 25 exec/s: 0 rss: 69Mb L: 17/23 MS: 1 ChangeBit- 00:08:31.524 [2024-04-25 20:55:46.965965] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.524 [2024-04-25 20:55:46.965991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.524 [2024-04-25 20:55:46.966042] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.524 [2024-04-25 20:55:46.966057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.524 #25 NEW cov: 11904 ft: 13838 corp: 11/165b lim: 25 exec/s: 0 rss: 69Mb L: 11/23 MS: 1 ShuffleBytes- 00:08:31.524 [2024-04-25 20:55:47.006178] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.524 [2024-04-25 20:55:47.006204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.524 [2024-04-25 20:55:47.006249] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.524 [2024-04-25 20:55:47.006263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.524 [2024-04-25 20:55:47.006321] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:31.524 [2024-04-25 20:55:47.006336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.524 #26 NEW cov: 11904 ft: 13856 corp: 12/182b lim: 25 exec/s: 0 rss: 69Mb L: 17/23 MS: 1 ChangeBit- 00:08:31.524 [2024-04-25 20:55:47.046450] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.524 [2024-04-25 20:55:47.046476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.524 [2024-04-25 20:55:47.046533] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.524 [2024-04-25 20:55:47.046548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 
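Every completion that spdk_nvme_print_completion logs in these runs carries the status tuple (00/0b), rendered as INVALID NAMESPACE OR FORMAT: the first hex number is the NVMe status code type (SCT, 0x0 = generic command status) and the second the status code (SC, 0x0b), followed by the phase (p), more (m), and do-not-retry (dnr) bits. A minimal decode sketch, assuming only the status-field layout from the NVMe specification; the struct and helper here are illustrative stand-ins, not SPDK's own definitions:

#include <stdint.h>
#include <stdio.h>

/* NVMe completion queue entry, dword 3, bits 16..31:
 * phase tag followed by the 15-bit status field. */
struct nvme_status {
    uint16_t p   : 1;  /* phase tag */
    uint16_t sc  : 8;  /* status code */
    uint16_t sct : 3;  /* status code type */
    uint16_t crd : 2;  /* command retry delay */
    uint16_t m   : 1;  /* more */
    uint16_t dnr : 1;  /* do not retry */
};

static const char *status_name(uint8_t sct, uint8_t sc)
{
    if (sct == 0x0 && sc == 0x0b)
        return "INVALID NAMESPACE OR FORMAT";
    return "OTHER";
}

int main(void)
{
    /* The (00/0b) tuple printed throughout this run. */
    struct nvme_status st = { .p = 0, .sc = 0x0b, .sct = 0x0, .m = 0, .dnr = 1 };

    printf("%s (%02x/%02x) p:%u m:%u dnr:%u\n",
           status_name(st.sct, st.sc), st.sct, st.sc, st.p, st.m, st.dnr);
    return 0;
}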
00:08:31.524 [2024-04-25 20:55:47.046606] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:31.524 [2024-04-25 20:55:47.046621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.524 [2024-04-25 20:55:47.046682] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:31.524 [2024-04-25 20:55:47.046697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.524 #27 NEW cov: 11904 ft: 13971 corp: 13/205b lim: 25 exec/s: 0 rss: 69Mb L: 23/23 MS: 1 ChangeBinInt- 00:08:31.524 [2024-04-25 20:55:47.086464] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.524 [2024-04-25 20:55:47.086492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.524 [2024-04-25 20:55:47.086533] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.524 [2024-04-25 20:55:47.086548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.524 [2024-04-25 20:55:47.086607] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:31.524 [2024-04-25 20:55:47.086622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.524 #28 NEW cov: 11904 ft: 13978 corp: 14/223b lim: 25 exec/s: 0 rss: 70Mb L: 18/23 MS: 1 InsertByte- 00:08:31.524 [2024-04-25 20:55:47.126698] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.524 [2024-04-25 20:55:47.126725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.524 [2024-04-25 20:55:47.126776] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.524 [2024-04-25 20:55:47.126794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.524 [2024-04-25 20:55:47.126853] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:31.524 [2024-04-25 20:55:47.126868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.524 [2024-04-25 20:55:47.126926] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:31.524 [2024-04-25 20:55:47.126941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.524 #29 NEW cov: 11904 ft: 13990 corp: 15/245b lim: 25 exec/s: 0 rss: 70Mb L: 22/23 MS: 1 InsertRepeatedBytes- 00:08:31.524 [2024-04-25 20:55:47.166690] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.524 [2024-04-25 20:55:47.166717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 
00:08:31.524 [2024-04-25 20:55:47.166767] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.524 [2024-04-25 20:55:47.166782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.524 [2024-04-25 20:55:47.166840] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:31.524 [2024-04-25 20:55:47.166856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.524 #30 NEW cov: 11904 ft: 14005 corp: 16/263b lim: 25 exec/s: 0 rss: 70Mb L: 18/23 MS: 1 InsertByte- 00:08:31.783 [2024-04-25 20:55:47.206789] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.783 [2024-04-25 20:55:47.206816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.783 [2024-04-25 20:55:47.206857] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.783 [2024-04-25 20:55:47.206874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.783 [2024-04-25 20:55:47.206932] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:31.783 [2024-04-25 20:55:47.206949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.783 NEW_FUNC[1/1]: 0x19e6b30 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:31.783 #31 NEW cov: 11927 ft: 14155 corp: 17/280b lim: 25 exec/s: 0 rss: 70Mb L: 17/23 MS: 1 ShuffleBytes- 00:08:31.783 [2024-04-25 20:55:47.246894] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.783 [2024-04-25 20:55:47.246921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.783 [2024-04-25 20:55:47.246969] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.783 [2024-04-25 20:55:47.246986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.783 [2024-04-25 20:55:47.247052] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:31.783 [2024-04-25 20:55:47.247068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.783 #32 NEW cov: 11927 ft: 14182 corp: 18/297b lim: 25 exec/s: 0 rss: 70Mb L: 17/23 MS: 1 ChangeBit- 00:08:31.783 [2024-04-25 20:55:47.286873] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.783 [2024-04-25 20:55:47.286904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.783 [2024-04-25 20:55:47.286961] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.783 [2024-04-25 20:55:47.286977] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.783 #33 NEW cov: 11927 ft: 14191 corp: 19/308b lim: 25 exec/s: 33 rss: 70Mb L: 11/23 MS: 1 ChangeBit- 00:08:31.783 [2024-04-25 20:55:47.327236] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.783 [2024-04-25 20:55:47.327263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.783 [2024-04-25 20:55:47.327312] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.783 [2024-04-25 20:55:47.327328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.783 [2024-04-25 20:55:47.327386] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:31.783 [2024-04-25 20:55:47.327402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.783 [2024-04-25 20:55:47.327463] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:31.783 [2024-04-25 20:55:47.327479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.783 #34 NEW cov: 11927 ft: 14202 corp: 20/329b lim: 25 exec/s: 34 rss: 70Mb L: 21/23 MS: 1 CMP- DE: "\377\377\377\377"- 00:08:31.783 [2024-04-25 20:55:47.367112] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.783 [2024-04-25 20:55:47.367138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.783 [2024-04-25 20:55:47.367181] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.783 [2024-04-25 20:55:47.367197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.783 #35 NEW cov: 11927 ft: 14297 corp: 21/341b lim: 25 exec/s: 35 rss: 70Mb L: 12/23 MS: 1 InsertByte- 00:08:31.783 [2024-04-25 20:55:47.407218] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.783 [2024-04-25 20:55:47.407244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.783 [2024-04-25 20:55:47.407285] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.783 [2024-04-25 20:55:47.407301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.783 #36 NEW cov: 11927 ft: 14479 corp: 22/352b lim: 25 exec/s: 36 rss: 70Mb L: 11/23 MS: 1 ChangeByte- 00:08:32.043 [2024-04-25 20:55:47.447578] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.043 [2024-04-25 20:55:47.447605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.043 [2024-04-25 20:55:47.447653] nvme_qpair.c: 256:nvme_io_qpair_print_command: 
*NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.043 [2024-04-25 20:55:47.447668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.043 [2024-04-25 20:55:47.447729] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:32.043 [2024-04-25 20:55:47.447745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.043 [2024-04-25 20:55:47.447807] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:32.043 [2024-04-25 20:55:47.447823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.043 #37 NEW cov: 11927 ft: 14486 corp: 23/374b lim: 25 exec/s: 37 rss: 70Mb L: 22/23 MS: 1 InsertByte- 00:08:32.043 [2024-04-25 20:55:47.487547] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.043 [2024-04-25 20:55:47.487574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.043 [2024-04-25 20:55:47.487616] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.043 [2024-04-25 20:55:47.487632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.043 [2024-04-25 20:55:47.487692] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:32.043 [2024-04-25 20:55:47.487709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.043 #38 NEW cov: 11927 ft: 14492 corp: 24/390b lim: 25 exec/s: 38 rss: 70Mb L: 16/23 MS: 1 EraseBytes- 00:08:32.043 [2024-04-25 20:55:47.527851] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.043 [2024-04-25 20:55:47.527878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.043 [2024-04-25 20:55:47.527927] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.043 [2024-04-25 20:55:47.527942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.043 [2024-04-25 20:55:47.528002] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:32.043 [2024-04-25 20:55:47.528018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.043 [2024-04-25 20:55:47.528078] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:32.043 [2024-04-25 20:55:47.528094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.043 #39 NEW cov: 11927 ft: 14503 corp: 25/412b lim: 25 exec/s: 39 rss: 70Mb L: 22/23 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:08:32.043 [2024-04-25 20:55:47.567962] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.043 [2024-04-25 20:55:47.567988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.043 [2024-04-25 20:55:47.568050] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.043 [2024-04-25 20:55:47.568066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.043 [2024-04-25 20:55:47.568123] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:32.043 [2024-04-25 20:55:47.568139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.043 [2024-04-25 20:55:47.568199] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:32.043 [2024-04-25 20:55:47.568215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.043 #40 NEW cov: 11927 ft: 14505 corp: 26/435b lim: 25 exec/s: 40 rss: 70Mb L: 23/23 MS: 1 InsertRepeatedBytes- 00:08:32.043 [2024-04-25 20:55:47.607754] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.043 [2024-04-25 20:55:47.607783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.043 [2024-04-25 20:55:47.607842] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.043 [2024-04-25 20:55:47.607859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.043 #41 NEW cov: 11927 ft: 14524 corp: 27/447b lim: 25 exec/s: 41 rss: 70Mb L: 12/23 MS: 1 InsertByte- 00:08:32.043 [2024-04-25 20:55:47.647882] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.043 [2024-04-25 20:55:47.647909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.043 [2024-04-25 20:55:47.647966] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.043 [2024-04-25 20:55:47.647982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.043 #46 NEW cov: 11927 ft: 14567 corp: 28/458b lim: 25 exec/s: 46 rss: 70Mb L: 11/23 MS: 5 InsertByte-ShuffleBytes-InsertByte-ChangeByte-InsertRepeatedBytes- 00:08:32.043 [2024-04-25 20:55:47.688145] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.043 [2024-04-25 20:55:47.688171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.043 [2024-04-25 20:55:47.688234] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.043 [2024-04-25 20:55:47.688250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 
00:08:32.043 [2024-04-25 20:55:47.688309] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:32.043 [2024-04-25 20:55:47.688324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.302 #47 NEW cov: 11927 ft: 14610 corp: 29/476b lim: 25 exec/s: 47 rss: 70Mb L: 18/23 MS: 1 ChangeBinInt- 00:08:32.302 [2024-04-25 20:55:47.728286] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.302 [2024-04-25 20:55:47.728313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.302 [2024-04-25 20:55:47.728369] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.302 [2024-04-25 20:55:47.728385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.302 [2024-04-25 20:55:47.728446] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:32.302 [2024-04-25 20:55:47.728462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.302 #48 NEW cov: 11927 ft: 14620 corp: 30/493b lim: 25 exec/s: 48 rss: 70Mb L: 17/23 MS: 1 ChangeBit- 00:08:32.302 [2024-04-25 20:55:47.768385] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.302 [2024-04-25 20:55:47.768411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.302 [2024-04-25 20:55:47.768451] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.302 [2024-04-25 20:55:47.768467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.302 [2024-04-25 20:55:47.768528] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:32.302 [2024-04-25 20:55:47.768544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.302 #49 NEW cov: 11927 ft: 14638 corp: 31/510b lim: 25 exec/s: 49 rss: 70Mb L: 17/23 MS: 1 ChangeBinInt- 00:08:32.302 [2024-04-25 20:55:47.808625] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.302 [2024-04-25 20:55:47.808652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.302 [2024-04-25 20:55:47.808709] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.302 [2024-04-25 20:55:47.808726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.302 [2024-04-25 20:55:47.808785] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:32.302 [2024-04-25 20:55:47.808801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.302 
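Each "#N NEW" record interleaved above is libFuzzer's standard progress line: cov counts covered code points, ft counts coverage features, corp gives corpus units and total bytes, lim is the current input-length cap, exec/s is throughput, and MS names the mutation sequence that produced the input. A short post-processing sketch for pulling the numeric fields out of such a line; the field layout is assumed only from the lines shown in this log, and no SPDK headers are needed:

#include <stdio.h>

int main(void)
{
    /* A progress line copied verbatim from the run above. */
    const char *line =
        "#47 NEW cov: 11927 ft: 14610 corp: 29/476b lim: 25 exec/s: 47 rss: 70Mb";
    int id, cov, ft, units, bytes, lim, execs, rss;

    if (sscanf(line,
               "#%d NEW cov: %d ft: %d corp: %d/%db lim: %d exec/s: %d rss: %dMb",
               &id, &cov, &ft, &units, &bytes, &lim, &execs, &rss) == 8) {
        printf("input %d: %d cov, %d features, corpus %d units / %d bytes\n",
               id, cov, ft, units, bytes);
    }
    return 0;
}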
[2024-04-25 20:55:47.808861] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:32.302 [2024-04-25 20:55:47.808877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.302 #50 NEW cov: 11927 ft: 14676 corp: 32/533b lim: 25 exec/s: 50 rss: 70Mb L: 23/23 MS: 1 ChangeBit- 00:08:32.302 [2024-04-25 20:55:47.848609] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.302 [2024-04-25 20:55:47.848636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.302 [2024-04-25 20:55:47.848677] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.302 [2024-04-25 20:55:47.848693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.302 [2024-04-25 20:55:47.848751] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:32.302 [2024-04-25 20:55:47.848768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.302 #51 NEW cov: 11927 ft: 14685 corp: 33/549b lim: 25 exec/s: 51 rss: 70Mb L: 16/23 MS: 1 CMP- DE: "\012\361\213\352\013\375v\000"- 00:08:32.302 [2024-04-25 20:55:47.888963] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.302 [2024-04-25 20:55:47.888996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.302 [2024-04-25 20:55:47.889057] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.302 [2024-04-25 20:55:47.889072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.302 [2024-04-25 20:55:47.889130] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:32.302 [2024-04-25 20:55:47.889145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.302 [2024-04-25 20:55:47.889205] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:32.302 [2024-04-25 20:55:47.889220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.302 [2024-04-25 20:55:47.889279] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:32.302 [2024-04-25 20:55:47.889294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:32.302 #52 NEW cov: 11927 ft: 14754 corp: 34/574b lim: 25 exec/s: 52 rss: 70Mb L: 25/25 MS: 1 CopyPart- 00:08:32.302 [2024-04-25 20:55:47.928826] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.302 [2024-04-25 20:55:47.928857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 
00:08:32.302 [2024-04-25 20:55:47.928917] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.302 [2024-04-25 20:55:47.928934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.302 [2024-04-25 20:55:47.928997] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:32.302 [2024-04-25 20:55:47.929013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.302 #53 NEW cov: 11927 ft: 14800 corp: 35/592b lim: 25 exec/s: 53 rss: 70Mb L: 18/25 MS: 1 ShuffleBytes- 00:08:32.561 [2024-04-25 20:55:47.968792] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.561 [2024-04-25 20:55:47.968819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.561 [2024-04-25 20:55:47.968876] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.561 [2024-04-25 20:55:47.968893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.561 #54 NEW cov: 11927 ft: 14810 corp: 36/604b lim: 25 exec/s: 54 rss: 70Mb L: 12/25 MS: 1 ChangeByte- 00:08:32.561 [2024-04-25 20:55:48.009064] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.561 [2024-04-25 20:55:48.009090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.561 [2024-04-25 20:55:48.009139] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.561 [2024-04-25 20:55:48.009154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.561 [2024-04-25 20:55:48.009214] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:32.561 [2024-04-25 20:55:48.009230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.561 #55 NEW cov: 11927 ft: 14821 corp: 37/619b lim: 25 exec/s: 55 rss: 70Mb L: 15/25 MS: 1 EraseBytes- 00:08:32.561 [2024-04-25 20:55:48.049052] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.561 [2024-04-25 20:55:48.049078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.561 [2024-04-25 20:55:48.049131] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.561 [2024-04-25 20:55:48.049147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.561 #56 NEW cov: 11927 ft: 14824 corp: 38/630b lim: 25 exec/s: 56 rss: 70Mb L: 11/25 MS: 1 ShuffleBytes- 00:08:32.561 [2024-04-25 20:55:48.089303] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.561 [2024-04-25 20:55:48.089329] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.561 [2024-04-25 20:55:48.089370] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.561 [2024-04-25 20:55:48.089385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.561 [2024-04-25 20:55:48.089447] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:32.561 [2024-04-25 20:55:48.089463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.561 #57 NEW cov: 11927 ft: 14834 corp: 39/649b lim: 25 exec/s: 57 rss: 70Mb L: 19/25 MS: 1 EraseBytes- 00:08:32.561 [2024-04-25 20:55:48.129416] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.561 [2024-04-25 20:55:48.129442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.561 [2024-04-25 20:55:48.129482] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.561 [2024-04-25 20:55:48.129498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.561 [2024-04-25 20:55:48.129555] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:32.561 [2024-04-25 20:55:48.129571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.561 #58 NEW cov: 11927 ft: 14836 corp: 40/666b lim: 25 exec/s: 58 rss: 71Mb L: 17/25 MS: 1 ChangeBit- 00:08:32.561 [2024-04-25 20:55:48.169651] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.561 [2024-04-25 20:55:48.169678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.561 [2024-04-25 20:55:48.169722] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.561 [2024-04-25 20:55:48.169739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.561 [2024-04-25 20:55:48.169796] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:32.561 [2024-04-25 20:55:48.169812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.561 [2024-04-25 20:55:48.169872] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:32.561 [2024-04-25 20:55:48.169888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.561 #59 NEW cov: 11927 ft: 14849 corp: 41/689b lim: 25 exec/s: 59 rss: 71Mb L: 23/25 MS: 1 InsertRepeatedBytes- 00:08:32.561 [2024-04-25 20:55:48.209539] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.561 [2024-04-25 20:55:48.209567] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.561 [2024-04-25 20:55:48.209614] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.561 [2024-04-25 20:55:48.209629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.821 #60 NEW cov: 11927 ft: 14859 corp: 42/701b lim: 25 exec/s: 60 rss: 71Mb L: 12/25 MS: 1 ChangeByte- 00:08:32.821 [2024-04-25 20:55:48.249912] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.821 [2024-04-25 20:55:48.249939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.821 [2024-04-25 20:55:48.250000] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.821 [2024-04-25 20:55:48.250016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.821 [2024-04-25 20:55:48.250071] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:32.821 [2024-04-25 20:55:48.250088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.821 [2024-04-25 20:55:48.250147] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:32.821 [2024-04-25 20:55:48.250166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.821 #61 NEW cov: 11927 ft: 14867 corp: 43/722b lim: 25 exec/s: 61 rss: 71Mb L: 21/25 MS: 1 CopyPart- 00:08:32.821 [2024-04-25 20:55:48.289763] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.821 [2024-04-25 20:55:48.289789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.821 [2024-04-25 20:55:48.289837] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.821 [2024-04-25 20:55:48.289853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.821 #62 NEW cov: 11927 ft: 14894 corp: 44/733b lim: 25 exec/s: 31 rss: 71Mb L: 11/25 MS: 1 ChangeByte- 00:08:32.821 #62 DONE cov: 11927 ft: 14894 corp: 44/733b lim: 25 exec/s: 31 rss: 71Mb 00:08:32.821 ###### Recommended dictionary. ###### 00:08:32.821 "\377\377\377\377" # Uses: 1 00:08:32.821 "\012\361\213\352\013\375v\000" # Uses: 0 00:08:32.821 ###### End of recommended dictionary. 
###### 00:08:32.821 Done 62 runs in 2 second(s) 00:08:32.821 20:55:48 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_23.conf /var/tmp/suppress_nvmf_fuzz 00:08:32.821 20:55:48 -- ../common.sh@72 -- # (( i++ )) 00:08:32.821 20:55:48 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:32.821 20:55:48 -- ../common.sh@73 -- # start_llvm_fuzz 24 1 0x1 00:08:32.821 20:55:48 -- nvmf/run.sh@23 -- # local fuzzer_type=24 00:08:32.821 20:55:48 -- nvmf/run.sh@24 -- # local timen=1 00:08:32.821 20:55:48 -- nvmf/run.sh@25 -- # local core=0x1 00:08:32.821 20:55:48 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:32.821 20:55:48 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_24.conf 00:08:32.821 20:55:48 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:32.821 20:55:48 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:32.821 20:55:48 -- nvmf/run.sh@34 -- # printf %02d 24 00:08:32.821 20:55:48 -- nvmf/run.sh@34 -- # port=4424 00:08:32.821 20:55:48 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:32.821 20:55:48 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' 00:08:32.821 20:55:48 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4424"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:32.821 20:55:48 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:32.821 20:55:48 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:32.822 20:55:48 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' -c /tmp/fuzz_json_24.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 -Z 24 00:08:32.822 [2024-04-25 20:55:48.454583] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:08:32.822 [2024-04-25 20:55:48.454660] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid202861 ] 00:08:33.081 EAL: No free 2048 kB hugepages reported on node 1 00:08:33.081 [2024-04-25 20:55:48.594047] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:08:33.081 [2024-04-25 20:55:48.631324] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:33.081 [2024-04-25 20:55:48.651667] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:33.081 [2024-04-25 20:55:48.703960] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:33.081 [2024-04-25 20:55:48.720281] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4424 *** 00:08:33.081 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:33.081 INFO: Seed: 1696341486 00:08:33.341 INFO: Loaded 1 modules (348579 inline 8-bit counters): 348579 [0x2764bcc, 0x27b9d6f), 00:08:33.341 INFO: Loaded 1 PC tables (348579 PCs): 348579 [0x27b9d70,0x2d0b7a0), 00:08:33.341 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:33.341 INFO: A corpus is not provided, starting from an empty corpus 00:08:33.341 #2 INITED exec/s: 0 rss: 61Mb 00:08:33.341 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:33.341 This may also happen if the target rejected all inputs we tried so far 00:08:33.341 [2024-04-25 20:55:48.775327] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.341 [2024-04-25 20:55:48.775356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.601 NEW_FUNC[1/672]: 0x4cf700 in fuzz_nvm_compare_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:685 00:08:33.601 NEW_FUNC[2/672]: 0x4e0360 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:33.601 #20 NEW cov: 11755 ft: 11755 corp: 2/34b lim: 100 exec/s: 0 rss: 68Mb L: 33/33 MS: 3 ChangeBit-InsertByte-InsertRepeatedBytes- 00:08:33.601 [2024-04-25 20:55:49.096181] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.601 [2024-04-25 20:55:49.096213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.601 #22 NEW cov: 11885 ft: 12209 corp: 3/69b lim: 100 exec/s: 0 rss: 68Mb L: 35/35 MS: 2 InsertByte-CrossOver- 00:08:33.601 [2024-04-25 20:55:49.136241] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.601 [2024-04-25 20:55:49.136269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.601 #23 NEW cov: 11891 ft: 12537 corp: 4/104b lim: 100 exec/s: 0 rss: 68Mb L: 35/35 MS: 1 ChangeBinInt- 00:08:33.601 [2024-04-25 20:55:49.176308] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.601 [2024-04-25 20:55:49.176336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.601 #24 NEW cov: 11976 ft: 12824 corp: 5/139b lim: 100 exec/s: 0 rss: 69Mb L: 35/35 MS: 1 ShuffleBytes- 00:08:33.601 [2024-04-25 20:55:49.216480] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.601 [2024-04-25 20:55:49.216507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.601 #25 NEW cov: 11976 ft: 12907 corp: 6/174b lim: 100 exec/s: 0 rss: 69Mb L: 35/35 MS: 1 ShuffleBytes- 00:08:33.601 [2024-04-25 20:55:49.246542] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.601 
[2024-04-25 20:55:49.246569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.860 #26 NEW cov: 11976 ft: 12948 corp: 7/209b lim: 100 exec/s: 0 rss: 69Mb L: 35/35 MS: 1 ChangeBit- 00:08:33.860 [2024-04-25 20:55:49.286684] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.860 [2024-04-25 20:55:49.286713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.860 #27 NEW cov: 11976 ft: 13060 corp: 8/244b lim: 100 exec/s: 0 rss: 69Mb L: 35/35 MS: 1 ChangeBinInt- 00:08:33.860 [2024-04-25 20:55:49.326918] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16426893921767943008 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.860 [2024-04-25 20:55:49.326948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.860 [2024-04-25 20:55:49.327039] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.861 [2024-04-25 20:55:49.327060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.861 #28 NEW cov: 11976 ft: 13953 corp: 9/287b lim: 100 exec/s: 0 rss: 69Mb L: 43/43 MS: 1 CMP- DE: "k`\343\370\014\375v\000"- 00:08:33.861 [2024-04-25 20:55:49.366914] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.861 [2024-04-25 20:55:49.366942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.861 #29 NEW cov: 11976 ft: 14030 corp: 10/320b lim: 100 exec/s: 0 rss: 69Mb L: 33/43 MS: 1 ChangeByte- 00:08:33.861 [2024-04-25 20:55:49.406968] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17873939704381661411 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.861 [2024-04-25 20:55:49.406998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.861 #30 NEW cov: 11976 ft: 14071 corp: 11/353b lim: 100 exec/s: 0 rss: 69Mb L: 33/43 MS: 1 PersAutoDict- DE: "k`\343\370\014\375v\000"- 00:08:33.861 [2024-04-25 20:55:49.447396] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:10995284049920 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.861 [2024-04-25 20:55:49.447426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.861 [2024-04-25 20:55:49.447493] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:150323855360 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.861 [2024-04-25 20:55:49.447513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.861 [2024-04-25 20:55:49.447578] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:8961 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.861 [2024-04-25 20:55:49.447596] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.861 #31 NEW cov: 11976 ft: 14411 corp: 12/420b lim: 100 exec/s: 0 rss: 69Mb L: 67/67 MS: 1 CrossOver- 00:08:33.861 [2024-04-25 20:55:49.487290] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.861 [2024-04-25 20:55:49.487320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.861 #32 NEW cov: 11976 ft: 14419 corp: 13/443b lim: 100 exec/s: 0 rss: 69Mb L: 23/67 MS: 1 EraseBytes- 00:08:34.120 [2024-04-25 20:55:49.527384] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.120 [2024-04-25 20:55:49.527412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.120 #33 NEW cov: 11976 ft: 14531 corp: 14/478b lim: 100 exec/s: 0 rss: 69Mb L: 35/67 MS: 1 ShuffleBytes- 00:08:34.120 [2024-04-25 20:55:49.567499] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.120 [2024-04-25 20:55:49.567527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.120 #34 NEW cov: 11976 ft: 14564 corp: 15/513b lim: 100 exec/s: 0 rss: 69Mb L: 35/67 MS: 1 ChangeByte- 00:08:34.120 [2024-04-25 20:55:49.607535] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:176291840 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.120 [2024-04-25 20:55:49.607563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.120 #35 NEW cov: 11976 ft: 14588 corp: 16/549b lim: 100 exec/s: 0 rss: 70Mb L: 36/67 MS: 1 InsertByte- 00:08:34.121 [2024-04-25 20:55:49.637805] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.121 [2024-04-25 20:55:49.637833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.121 [2024-04-25 20:55:49.637918] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:587202560 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.121 [2024-04-25 20:55:49.637940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.121 NEW_FUNC[1/1]: 0x19e6b30 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:34.121 #36 NEW cov: 11999 ft: 14620 corp: 17/592b lim: 100 exec/s: 0 rss: 70Mb L: 43/67 MS: 1 PersAutoDict- DE: "k`\343\370\014\375v\000"- 00:08:34.121 [2024-04-25 20:55:49.677774] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.121 [2024-04-25 20:55:49.677802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.121 #37 NEW cov: 11999 ft: 14679 corp: 18/627b lim: 100 exec/s: 
0 rss: 70Mb L: 35/67 MS: 1 ChangeBinInt- 00:08:34.121 [2024-04-25 20:55:49.717982] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:71226363254366435 len:63489 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.121 [2024-04-25 20:55:49.718019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.121 #38 NEW cov: 11999 ft: 14711 corp: 19/660b lim: 100 exec/s: 0 rss: 70Mb L: 33/67 MS: 1 ShuffleBytes- 00:08:34.121 [2024-04-25 20:55:49.758212] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16426893925727365984 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.121 [2024-04-25 20:55:49.758242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.121 [2024-04-25 20:55:49.758310] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.121 [2024-04-25 20:55:49.758330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.380 #39 NEW cov: 11999 ft: 14737 corp: 20/703b lim: 100 exec/s: 39 rss: 70Mb L: 43/67 MS: 1 ChangeBinInt- 00:08:34.380 [2024-04-25 20:55:49.808183] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.380 [2024-04-25 20:55:49.808212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.380 #40 NEW cov: 11999 ft: 14805 corp: 21/730b lim: 100 exec/s: 40 rss: 70Mb L: 27/67 MS: 1 EraseBytes- 00:08:34.380 [2024-04-25 20:55:49.848251] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:180556398592 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.380 [2024-04-25 20:55:49.848278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.380 #41 NEW cov: 11999 ft: 14816 corp: 22/765b lim: 100 exec/s: 41 rss: 70Mb L: 35/67 MS: 1 ChangeByte- 00:08:34.380 [2024-04-25 20:55:49.888593] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4398214283264 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.380 [2024-04-25 20:55:49.888621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.380 [2024-04-25 20:55:49.888716] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:587202560 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.380 [2024-04-25 20:55:49.888738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.380 #42 NEW cov: 11999 ft: 14826 corp: 23/808b lim: 100 exec/s: 42 rss: 70Mb L: 43/67 MS: 1 ChangeBit- 00:08:34.380 [2024-04-25 20:55:49.928540] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.380 [2024-04-25 20:55:49.928568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.380 #43 NEW cov: 11999 ft: 
14859 corp: 24/843b lim: 100 exec/s: 43 rss: 70Mb L: 35/67 MS: 1 ChangeBinInt- 00:08:34.380 [2024-04-25 20:55:49.968799] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4398214283264 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.380 [2024-04-25 20:55:49.968827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.380 [2024-04-25 20:55:49.968897] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:587202560 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.380 [2024-04-25 20:55:49.968921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.380 #44 NEW cov: 11999 ft: 14896 corp: 25/886b lim: 100 exec/s: 44 rss: 70Mb L: 43/67 MS: 1 ChangeBit- 00:08:34.380 [2024-04-25 20:55:50.009555] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:39 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.380 [2024-04-25 20:55:50.009588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.380 #50 NEW cov: 11999 ft: 14898 corp: 26/921b lim: 100 exec/s: 50 rss: 70Mb L: 35/67 MS: 1 CopyPart- 00:08:34.639 [2024-04-25 20:55:50.048825] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.639 [2024-04-25 20:55:50.048856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.639 #51 NEW cov: 11999 ft: 14905 corp: 27/948b lim: 100 exec/s: 51 rss: 70Mb L: 27/67 MS: 1 EraseBytes- 00:08:34.639 [2024-04-25 20:55:50.089343] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:30224350749655050 len:64887 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.639 [2024-04-25 20:55:50.089372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.639 [2024-04-25 20:55:50.089439] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:36 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.639 [2024-04-25 20:55:50.089461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.639 [2024-04-25 20:55:50.089531] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4653056 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.639 [2024-04-25 20:55:50.089553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.639 #52 NEW cov: 11999 ft: 14920 corp: 28/1014b lim: 100 exec/s: 52 rss: 70Mb L: 66/67 MS: 1 CrossOver- 00:08:34.639 [2024-04-25 20:55:50.129160] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.639 [2024-04-25 20:55:50.129189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.639 #53 NEW cov: 11999 ft: 14931 corp: 29/1049b lim: 100 exec/s: 53 rss: 70Mb L: 35/67 MS: 1 CopyPart- 00:08:34.639 
[2024-04-25 20:55:50.169256] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:532743716864 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.639 [2024-04-25 20:55:50.169284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.639 #54 NEW cov: 11999 ft: 14951 corp: 30/1085b lim: 100 exec/s: 54 rss: 70Mb L: 36/67 MS: 1 InsertByte- 00:08:34.639 [2024-04-25 20:55:50.209522] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4398214283264 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.639 [2024-04-25 20:55:50.209551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.639 [2024-04-25 20:55:50.209593] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:587202560 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.639 [2024-04-25 20:55:50.209608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.639 #55 NEW cov: 11999 ft: 14972 corp: 31/1128b lim: 100 exec/s: 55 rss: 70Mb L: 43/67 MS: 1 CMP- DE: " \000\000\000"- 00:08:34.639 [2024-04-25 20:55:50.259846] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:59368 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.639 [2024-04-25 20:55:50.259873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.639 [2024-04-25 20:55:50.259910] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:16710579925595711463 len:59368 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.639 [2024-04-25 20:55:50.259926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.639 [2024-04-25 20:55:50.259982] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:3890734848 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.639 [2024-04-25 20:55:50.260001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.639 #56 NEW cov: 12002 ft: 15142 corp: 32/1196b lim: 100 exec/s: 56 rss: 70Mb L: 68/68 MS: 1 InsertRepeatedBytes- 00:08:34.639 [2024-04-25 20:55:50.299776] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4398214283264 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.639 [2024-04-25 20:55:50.299803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.639 [2024-04-25 20:55:50.299855] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:587202560 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.639 [2024-04-25 20:55:50.299872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.898 #57 NEW cov: 12002 ft: 15154 corp: 33/1243b lim: 100 exec/s: 57 rss: 70Mb L: 47/68 MS: 1 CMP- DE: "\000\000\000\002"- 00:08:34.898 [2024-04-25 20:55:50.339687] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.898 [2024-04-25 20:55:50.339714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.898 #58 NEW cov: 12002 ft: 15176 corp: 34/1278b lim: 100 exec/s: 58 rss: 70Mb L: 35/68 MS: 1 ShuffleBytes- 00:08:34.898 [2024-04-25 20:55:50.379812] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.898 [2024-04-25 20:55:50.379838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.898 #59 NEW cov: 12002 ft: 15188 corp: 35/1305b lim: 100 exec/s: 59 rss: 70Mb L: 27/68 MS: 1 ChangeBit- 00:08:34.898 [2024-04-25 20:55:50.420089] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16426893921767943008 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.898 [2024-04-25 20:55:50.420115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.898 [2024-04-25 20:55:50.420168] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.898 [2024-04-25 20:55:50.420185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.898 #60 NEW cov: 12002 ft: 15194 corp: 36/1349b lim: 100 exec/s: 60 rss: 70Mb L: 44/68 MS: 1 InsertByte- 00:08:34.898 [2024-04-25 20:55:50.460062] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:184483840 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.898 [2024-04-25 20:55:50.460089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.898 #61 NEW cov: 12002 ft: 15252 corp: 37/1384b lim: 100 exec/s: 61 rss: 70Mb L: 35/68 MS: 1 ChangeByte- 00:08:34.898 [2024-04-25 20:55:50.500358] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.898 [2024-04-25 20:55:50.500385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.898 [2024-04-25 20:55:50.500441] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744069414584575 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.898 [2024-04-25 20:55:50.500456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.898 #62 NEW cov: 12002 ft: 15261 corp: 38/1427b lim: 100 exec/s: 62 rss: 70Mb L: 43/68 MS: 1 InsertRepeatedBytes- 00:08:34.898 [2024-04-25 20:55:50.550323] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17873939704381661411 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.898 [2024-04-25 20:55:50.550351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.158 #63 NEW cov: 12002 ft: 15265 corp: 39/1465b lim: 100 exec/s: 63 rss: 70Mb L: 38/68 MS: 1 CopyPart- 00:08:35.158 [2024-04-25 20:55:50.590545] nvme_qpair.c: 247:nvme_io_qpair_print_command: 
*NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:180556398592 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.158 [2024-04-25 20:55:50.590573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.158 [2024-04-25 20:55:50.590615] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:11258999068426240 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.158 [2024-04-25 20:55:50.590630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.158 #64 NEW cov: 12002 ft: 15291 corp: 40/1508b lim: 100 exec/s: 64 rss: 70Mb L: 43/68 MS: 1 PersAutoDict- DE: "k`\343\370\014\375v\000"- 00:08:35.158 [2024-04-25 20:55:50.640687] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16426893921767943008 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.158 [2024-04-25 20:55:50.640715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.158 [2024-04-25 20:55:50.640767] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:150323855360 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.158 [2024-04-25 20:55:50.640783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.158 #65 NEW cov: 12002 ft: 15302 corp: 41/1551b lim: 100 exec/s: 65 rss: 70Mb L: 43/68 MS: 1 ShuffleBytes- 00:08:35.158 [2024-04-25 20:55:50.680886] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:176291840 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.158 [2024-04-25 20:55:50.680913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.158 [2024-04-25 20:55:50.680958] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.158 [2024-04-25 20:55:50.680974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.158 #66 NEW cov: 12002 ft: 15310 corp: 42/1592b lim: 100 exec/s: 66 rss: 70Mb L: 41/68 MS: 1 CrossOver- 00:08:35.158 [2024-04-25 20:55:50.730844] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.158 [2024-04-25 20:55:50.730872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.158 #67 NEW cov: 12002 ft: 15316 corp: 43/1618b lim: 100 exec/s: 67 rss: 70Mb L: 26/68 MS: 1 EraseBytes- 00:08:35.158 [2024-04-25 20:55:50.771325] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17149718376310898688 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.158 [2024-04-25 20:55:50.771352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.158 [2024-04-25 20:55:50.771392] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:150323855360 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.158 [2024-04-25 20:55:50.771408] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.158 [2024-04-25 20:55:50.771465] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:8961 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.158 [2024-04-25 20:55:50.771480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.158 #68 NEW cov: 12002 ft: 15325 corp: 44/1685b lim: 100 exec/s: 34 rss: 70Mb L: 67/68 MS: 1 ChangeByte- 00:08:35.158 #68 DONE cov: 12002 ft: 15325 corp: 44/1685b lim: 100 exec/s: 34 rss: 70Mb 00:08:35.158 ###### Recommended dictionary. ###### 00:08:35.158 "k`\343\370\014\375v\000" # Uses: 3 00:08:35.158 " \000\000\000" # Uses: 0 00:08:35.158 "\000\000\000\002" # Uses: 0 00:08:35.159 ###### End of recommended dictionary. ###### 00:08:35.159 Done 68 runs in 2 second(s) 00:08:35.418 20:55:50 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_24.conf /var/tmp/suppress_nvmf_fuzz 00:08:35.418 20:55:50 -- ../common.sh@72 -- # (( i++ )) 00:08:35.418 20:55:50 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:35.418 20:55:50 -- nvmf/run.sh@79 -- # trap - SIGINT SIGTERM EXIT 00:08:35.418 00:08:35.418 real 1m2.066s 00:08:35.418 user 1m38.962s 00:08:35.418 sys 0m6.627s 00:08:35.418 20:55:50 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:08:35.418 20:55:50 -- common/autotest_common.sh@10 -- # set +x 00:08:35.418 ************************************ 00:08:35.418 END TEST nvmf_fuzz 00:08:35.418 ************************************ 00:08:35.418 20:55:50 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:08:35.418 20:55:50 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:08:35.418 20:55:50 -- fuzz/llvm.sh@63 -- # run_test vfio_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:35.418 20:55:50 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:08:35.418 20:55:50 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:08:35.418 20:55:50 -- common/autotest_common.sh@10 -- # set +x 00:08:35.679 ************************************ 00:08:35.679 START TEST vfio_fuzz 00:08:35.679 ************************************ 00:08:35.679 20:55:51 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:35.679 * Looking for test storage... 
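For reading the fuzzer output above: each "#N NEW cov: ... ft: ... corp: A/Bb lim: L exec/s: E rss: RMb L: x/y MS: ..." record is a libFuzzer status line -- cov counts covered code edges, ft counts features (finer-grained coverage signals), corp gives corpus units and total bytes, lim is the current input-length cap, exec/s and rss are throughput and resident memory, L is the new input's length, and MS names the mutation(s) that produced it; the final "#N DONE" line repeats the totals for the run. A throwaway one-liner to pull the final coverage figure from a saved log (illustrative only, not part of the harness; console.log is an assumed name for a capture of this output):

# Hypothetical helper: scan for cov: values on NEW lines, print the last.
grep 'NEW cov:' console.log | \
  awk '{for (i = 1; i < NF; i++) if ($i == "cov:") c = $(i+1)} END {print "final cov:", c}'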
00:08:35.679 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:35.679 20:55:51 -- vfio/run.sh@64 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:08:35.679 20:55:51 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:08:35.679 20:55:51 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:08:35.679 20:55:51 -- common/autotest_common.sh@34 -- # set -e 00:08:35.679 20:55:51 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:08:35.679 20:55:51 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:08:35.679 20:55:51 -- common/autotest_common.sh@38 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:08:35.679 20:55:51 -- common/autotest_common.sh@43 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:08:35.679 20:55:51 -- common/autotest_common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:08:35.679 20:55:51 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:08:35.679 20:55:51 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:08:35.679 20:55:51 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:08:35.679 20:55:51 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:08:35.679 20:55:51 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:08:35.679 20:55:51 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:08:35.679 20:55:51 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:08:35.679 20:55:51 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:08:35.679 20:55:51 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:08:35.679 20:55:51 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:08:35.679 20:55:51 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:08:35.679 20:55:51 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:08:35.679 20:55:51 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:08:35.679 20:55:51 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:08:35.679 20:55:51 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:08:35.679 20:55:51 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:08:35.679 20:55:51 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:08:35.679 20:55:51 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:08:35.679 20:55:51 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:35.679 20:55:51 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:08:35.679 20:55:51 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:08:35.679 20:55:51 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:08:35.679 20:55:51 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:08:35.679 20:55:51 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:08:35.679 20:55:51 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:08:35.679 20:55:51 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:08:35.679 20:55:51 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:08:35.679 20:55:51 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:08:35.679 20:55:51 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:08:35.679 20:55:51 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:08:35.679 20:55:51 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:08:35.679 20:55:51 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 
00:08:35.679 20:55:51 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:08:35.679 20:55:51 -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB=/usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:08:35.679 20:55:51 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:08:35.679 20:55:51 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:35.679 20:55:51 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:08:35.679 20:55:51 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:08:35.679 20:55:51 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:08:35.679 20:55:51 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:08:35.679 20:55:51 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:35.679 20:55:51 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:08:35.679 20:55:51 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:08:35.679 20:55:51 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:08:35.679 20:55:51 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:08:35.679 20:55:51 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:08:35.679 20:55:51 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:08:35.679 20:55:51 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:08:35.679 20:55:51 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:08:35.679 20:55:51 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:08:35.679 20:55:51 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:08:35.679 20:55:51 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:08:35.679 20:55:51 -- common/build_config.sh@53 -- # CONFIG_HAVE_EVP_MAC=y 00:08:35.679 20:55:51 -- common/build_config.sh@54 -- # CONFIG_URING_ZNS=n 00:08:35.679 20:55:51 -- common/build_config.sh@55 -- # CONFIG_WERROR=y 00:08:35.679 20:55:51 -- common/build_config.sh@56 -- # CONFIG_HAVE_LIBBSD=n 00:08:35.679 20:55:51 -- common/build_config.sh@57 -- # CONFIG_UBSAN=y 00:08:35.679 20:55:51 -- common/build_config.sh@58 -- # CONFIG_IPSEC_MB_DIR= 00:08:35.679 20:55:51 -- common/build_config.sh@59 -- # CONFIG_GOLANG=n 00:08:35.679 20:55:51 -- common/build_config.sh@60 -- # CONFIG_ISAL=y 00:08:35.679 20:55:51 -- common/build_config.sh@61 -- # CONFIG_IDXD_KERNEL=n 00:08:35.679 20:55:51 -- common/build_config.sh@62 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:35.679 20:55:51 -- common/build_config.sh@63 -- # CONFIG_RDMA_PROV=verbs 00:08:35.679 20:55:51 -- common/build_config.sh@64 -- # CONFIG_APPS=y 00:08:35.679 20:55:51 -- common/build_config.sh@65 -- # CONFIG_SHARED=n 00:08:35.679 20:55:51 -- common/build_config.sh@66 -- # CONFIG_HAVE_KEYUTILS=n 00:08:35.679 20:55:51 -- common/build_config.sh@67 -- # CONFIG_FC_PATH= 00:08:35.679 20:55:51 -- common/build_config.sh@68 -- # CONFIG_DPDK_PKG_CONFIG=n 00:08:35.679 20:55:51 -- common/build_config.sh@69 -- # CONFIG_FC=n 00:08:35.679 20:55:51 -- common/build_config.sh@70 -- # CONFIG_AVAHI=n 00:08:35.679 20:55:51 -- common/build_config.sh@71 -- # CONFIG_FIO_PLUGIN=y 00:08:35.679 20:55:51 -- common/build_config.sh@72 -- # CONFIG_RAID5F=n 00:08:35.679 20:55:51 -- common/build_config.sh@73 -- # CONFIG_EXAMPLES=y 00:08:35.679 20:55:51 -- common/build_config.sh@74 -- # CONFIG_TESTS=y 00:08:35.679 20:55:51 -- common/build_config.sh@75 -- # CONFIG_CRYPTO_MLX5=n 00:08:35.679 20:55:51 -- common/build_config.sh@76 -- # CONFIG_MAX_LCORES= 00:08:35.679 20:55:51 -- 
common/build_config.sh@77 -- # CONFIG_IPSEC_MB=n 00:08:35.679 20:55:51 -- common/build_config.sh@78 -- # CONFIG_PGO_DIR= 00:08:35.679 20:55:51 -- common/build_config.sh@79 -- # CONFIG_DEBUG=y 00:08:35.679 20:55:51 -- common/build_config.sh@80 -- # CONFIG_DPDK_COMPRESSDEV=n 00:08:35.679 20:55:51 -- common/build_config.sh@81 -- # CONFIG_CROSS_PREFIX= 00:08:35.679 20:55:51 -- common/build_config.sh@82 -- # CONFIG_URING=n 00:08:35.679 20:55:51 -- common/autotest_common.sh@53 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:35.679 20:55:51 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:35.679 20:55:51 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:35.679 20:55:51 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:35.679 20:55:51 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:35.679 20:55:51 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:35.679 20:55:51 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:08:35.679 20:55:51 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:35.679 20:55:51 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:08:35.679 20:55:51 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:08:35.679 20:55:51 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:08:35.679 20:55:51 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:08:35.679 20:55:51 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:08:35.679 20:55:51 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:08:35.679 20:55:51 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:08:35.679 20:55:51 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:08:35.679 #define SPDK_CONFIG_H 00:08:35.679 #define SPDK_CONFIG_APPS 1 00:08:35.679 #define SPDK_CONFIG_ARCH native 00:08:35.679 #undef SPDK_CONFIG_ASAN 00:08:35.679 #undef SPDK_CONFIG_AVAHI 00:08:35.679 #undef SPDK_CONFIG_CET 00:08:35.680 #define SPDK_CONFIG_COVERAGE 1 00:08:35.680 #define SPDK_CONFIG_CROSS_PREFIX 00:08:35.680 #undef SPDK_CONFIG_CRYPTO 00:08:35.680 #undef SPDK_CONFIG_CRYPTO_MLX5 00:08:35.680 #undef SPDK_CONFIG_CUSTOMOCF 00:08:35.680 #undef SPDK_CONFIG_DAOS 00:08:35.680 #define SPDK_CONFIG_DAOS_DIR 00:08:35.680 #define SPDK_CONFIG_DEBUG 1 00:08:35.680 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:08:35.680 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:35.680 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:35.680 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:35.680 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:08:35.680 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:35.680 #define SPDK_CONFIG_EXAMPLES 1 00:08:35.680 #undef SPDK_CONFIG_FC 00:08:35.680 #define SPDK_CONFIG_FC_PATH 00:08:35.680 #define SPDK_CONFIG_FIO_PLUGIN 1 00:08:35.680 #define SPDK_CONFIG_FIO_SOURCE_DIR 
/usr/src/fio 00:08:35.680 #undef SPDK_CONFIG_FUSE 00:08:35.680 #define SPDK_CONFIG_FUZZER 1 00:08:35.680 #define SPDK_CONFIG_FUZZER_LIB /usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:08:35.680 #undef SPDK_CONFIG_GOLANG 00:08:35.680 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:08:35.680 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:08:35.680 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:08:35.680 #undef SPDK_CONFIG_HAVE_KEYUTILS 00:08:35.680 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:08:35.680 #undef SPDK_CONFIG_HAVE_LIBBSD 00:08:35.680 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:08:35.680 #define SPDK_CONFIG_IDXD 1 00:08:35.680 #undef SPDK_CONFIG_IDXD_KERNEL 00:08:35.680 #undef SPDK_CONFIG_IPSEC_MB 00:08:35.680 #define SPDK_CONFIG_IPSEC_MB_DIR 00:08:35.680 #define SPDK_CONFIG_ISAL 1 00:08:35.680 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:08:35.680 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:08:35.680 #define SPDK_CONFIG_LIBDIR 00:08:35.680 #undef SPDK_CONFIG_LTO 00:08:35.680 #define SPDK_CONFIG_MAX_LCORES 00:08:35.680 #define SPDK_CONFIG_NVME_CUSE 1 00:08:35.680 #undef SPDK_CONFIG_OCF 00:08:35.680 #define SPDK_CONFIG_OCF_PATH 00:08:35.680 #define SPDK_CONFIG_OPENSSL_PATH 00:08:35.680 #undef SPDK_CONFIG_PGO_CAPTURE 00:08:35.680 #define SPDK_CONFIG_PGO_DIR 00:08:35.680 #undef SPDK_CONFIG_PGO_USE 00:08:35.680 #define SPDK_CONFIG_PREFIX /usr/local 00:08:35.680 #undef SPDK_CONFIG_RAID5F 00:08:35.680 #undef SPDK_CONFIG_RBD 00:08:35.680 #define SPDK_CONFIG_RDMA 1 00:08:35.680 #define SPDK_CONFIG_RDMA_PROV verbs 00:08:35.680 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:08:35.680 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:08:35.680 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:08:35.680 #undef SPDK_CONFIG_SHARED 00:08:35.680 #undef SPDK_CONFIG_SMA 00:08:35.680 #define SPDK_CONFIG_TESTS 1 00:08:35.680 #undef SPDK_CONFIG_TSAN 00:08:35.680 #define SPDK_CONFIG_UBLK 1 00:08:35.680 #define SPDK_CONFIG_UBSAN 1 00:08:35.680 #undef SPDK_CONFIG_UNIT_TESTS 00:08:35.680 #undef SPDK_CONFIG_URING 00:08:35.680 #define SPDK_CONFIG_URING_PATH 00:08:35.680 #undef SPDK_CONFIG_URING_ZNS 00:08:35.680 #undef SPDK_CONFIG_USDT 00:08:35.680 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:08:35.680 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:08:35.680 #define SPDK_CONFIG_VFIO_USER 1 00:08:35.680 #define SPDK_CONFIG_VFIO_USER_DIR 00:08:35.680 #define SPDK_CONFIG_VHOST 1 00:08:35.680 #define SPDK_CONFIG_VIRTIO 1 00:08:35.680 #undef SPDK_CONFIG_VTUNE 00:08:35.680 #define SPDK_CONFIG_VTUNE_DIR 00:08:35.680 #define SPDK_CONFIG_WERROR 1 00:08:35.680 #define SPDK_CONFIG_WPDK_DIR 00:08:35.680 #undef SPDK_CONFIG_XNVME 00:08:35.680 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:08:35.680 20:55:51 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:08:35.680 20:55:51 -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:08:35.680 20:55:51 -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:08:35.680 20:55:51 -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:35.680 20:55:51 -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:35.680 20:55:51 -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:35.680 20:55:51 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:35.680 20:55:51 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:35.680 20:55:51 -- paths/export.sh@5 -- # export PATH 00:08:35.680 20:55:51 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:35.680 20:55:51 -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:35.680 20:55:51 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:35.680 20:55:51 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:35.680 20:55:51 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:35.680 20:55:51 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:08:35.680 20:55:51 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:35.680 20:55:51 -- pm/common@67 -- # TEST_TAG=N/A 00:08:35.680 20:55:51 -- pm/common@68 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:08:35.680 20:55:51 -- pm/common@70 -- # PM_OUTPUTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:08:35.680 20:55:51 -- pm/common@71 -- # uname -s 00:08:35.680 20:55:51 -- pm/common@71 -- # PM_OS=Linux 00:08:35.680 20:55:51 -- pm/common@73 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:08:35.680 20:55:51 -- pm/common@74 -- # [[ Linux == FreeBSD ]] 00:08:35.680 20:55:51 -- pm/common@76 -- # [[ Linux == Linux ]] 00:08:35.680 20:55:51 -- pm/common@76 -- # [[ ............................... 
!= QEMU ]] 00:08:35.680 20:55:51 -- pm/common@76 -- # [[ ! -e /.dockerenv ]] 00:08:35.680 20:55:51 -- pm/common@79 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:08:35.680 20:55:51 -- pm/common@80 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:08:35.680 20:55:51 -- pm/common@83 -- # MONITOR_RESOURCES_PIDS=() 00:08:35.680 20:55:51 -- pm/common@83 -- # declare -A MONITOR_RESOURCES_PIDS 00:08:35.680 20:55:51 -- pm/common@85 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:08:35.680 20:55:51 -- common/autotest_common.sh@57 -- # : 1 00:08:35.680 20:55:51 -- common/autotest_common.sh@58 -- # export RUN_NIGHTLY 00:08:35.680 20:55:51 -- common/autotest_common.sh@61 -- # : 0 00:08:35.680 20:55:51 -- common/autotest_common.sh@62 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:08:35.680 20:55:51 -- common/autotest_common.sh@63 -- # : 0 00:08:35.680 20:55:51 -- common/autotest_common.sh@64 -- # export SPDK_RUN_VALGRIND 00:08:35.680 20:55:51 -- common/autotest_common.sh@65 -- # : 1 00:08:35.680 20:55:51 -- common/autotest_common.sh@66 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:08:35.680 20:55:51 -- common/autotest_common.sh@67 -- # : 0 00:08:35.680 20:55:51 -- common/autotest_common.sh@68 -- # export SPDK_TEST_UNITTEST 00:08:35.680 20:55:51 -- common/autotest_common.sh@69 -- # : 00:08:35.680 20:55:51 -- common/autotest_common.sh@70 -- # export SPDK_TEST_AUTOBUILD 00:08:35.680 20:55:51 -- common/autotest_common.sh@71 -- # : 0 00:08:35.680 20:55:51 -- common/autotest_common.sh@72 -- # export SPDK_TEST_RELEASE_BUILD 00:08:35.680 20:55:51 -- common/autotest_common.sh@73 -- # : 0 00:08:35.680 20:55:51 -- common/autotest_common.sh@74 -- # export SPDK_TEST_ISAL 00:08:35.680 20:55:51 -- common/autotest_common.sh@75 -- # : 0 00:08:35.680 20:55:51 -- common/autotest_common.sh@76 -- # export SPDK_TEST_ISCSI 00:08:35.680 20:55:51 -- common/autotest_common.sh@77 -- # : 0 00:08:35.680 20:55:51 -- common/autotest_common.sh@78 -- # export SPDK_TEST_ISCSI_INITIATOR 00:08:35.680 20:55:51 -- common/autotest_common.sh@79 -- # : 0 00:08:35.680 20:55:51 -- common/autotest_common.sh@80 -- # export SPDK_TEST_NVME 00:08:35.680 20:55:51 -- common/autotest_common.sh@81 -- # : 0 00:08:35.680 20:55:51 -- common/autotest_common.sh@82 -- # export SPDK_TEST_NVME_PMR 00:08:35.680 20:55:51 -- common/autotest_common.sh@83 -- # : 0 00:08:35.680 20:55:51 -- common/autotest_common.sh@84 -- # export SPDK_TEST_NVME_BP 00:08:35.680 20:55:51 -- common/autotest_common.sh@85 -- # : 0 00:08:35.680 20:55:51 -- common/autotest_common.sh@86 -- # export SPDK_TEST_NVME_CLI 00:08:35.680 20:55:51 -- common/autotest_common.sh@87 -- # : 0 00:08:35.680 20:55:51 -- common/autotest_common.sh@88 -- # export SPDK_TEST_NVME_CUSE 00:08:35.680 20:55:51 -- common/autotest_common.sh@89 -- # : 0 00:08:35.680 20:55:51 -- common/autotest_common.sh@90 -- # export SPDK_TEST_NVME_FDP 00:08:35.680 20:55:51 -- common/autotest_common.sh@91 -- # : 0 00:08:35.680 20:55:51 -- common/autotest_common.sh@92 -- # export SPDK_TEST_NVMF 00:08:35.680 20:55:51 -- common/autotest_common.sh@93 -- # : 0 00:08:35.680 20:55:51 -- common/autotest_common.sh@94 -- # export SPDK_TEST_VFIOUSER 00:08:35.680 20:55:51 -- common/autotest_common.sh@95 -- # : 0 00:08:35.680 20:55:51 -- common/autotest_common.sh@96 -- # export SPDK_TEST_VFIOUSER_QEMU 00:08:35.680 20:55:51 -- common/autotest_common.sh@97 -- # : 1 00:08:35.680 20:55:51 -- common/autotest_common.sh@98 -- # export SPDK_TEST_FUZZER 00:08:35.680 20:55:51 -- common/autotest_common.sh@99 -- # : 1 00:08:35.680 20:55:51 -- 
common/autotest_common.sh@100 -- # export SPDK_TEST_FUZZER_SHORT 00:08:35.680 20:55:51 -- common/autotest_common.sh@101 -- # : rdma 00:08:35.680 20:55:51 -- common/autotest_common.sh@102 -- # export SPDK_TEST_NVMF_TRANSPORT 00:08:35.680 20:55:51 -- common/autotest_common.sh@103 -- # : 0 00:08:35.680 20:55:51 -- common/autotest_common.sh@104 -- # export SPDK_TEST_RBD 00:08:35.680 20:55:51 -- common/autotest_common.sh@105 -- # : 0 00:08:35.680 20:55:51 -- common/autotest_common.sh@106 -- # export SPDK_TEST_VHOST 00:08:35.680 20:55:51 -- common/autotest_common.sh@107 -- # : 0 00:08:35.680 20:55:51 -- common/autotest_common.sh@108 -- # export SPDK_TEST_BLOCKDEV 00:08:35.680 20:55:51 -- common/autotest_common.sh@109 -- # : 0 00:08:35.680 20:55:51 -- common/autotest_common.sh@110 -- # export SPDK_TEST_IOAT 00:08:35.680 20:55:51 -- common/autotest_common.sh@111 -- # : 0 00:08:35.680 20:55:51 -- common/autotest_common.sh@112 -- # export SPDK_TEST_BLOBFS 00:08:35.680 20:55:51 -- common/autotest_common.sh@113 -- # : 0 00:08:35.680 20:55:51 -- common/autotest_common.sh@114 -- # export SPDK_TEST_VHOST_INIT 00:08:35.680 20:55:51 -- common/autotest_common.sh@115 -- # : 0 00:08:35.680 20:55:51 -- common/autotest_common.sh@116 -- # export SPDK_TEST_LVOL 00:08:35.680 20:55:51 -- common/autotest_common.sh@117 -- # : 0 00:08:35.680 20:55:51 -- common/autotest_common.sh@118 -- # export SPDK_TEST_VBDEV_COMPRESS 00:08:35.680 20:55:51 -- common/autotest_common.sh@119 -- # : 0 00:08:35.680 20:55:51 -- common/autotest_common.sh@120 -- # export SPDK_RUN_ASAN 00:08:35.680 20:55:51 -- common/autotest_common.sh@121 -- # : 1 00:08:35.680 20:55:51 -- common/autotest_common.sh@122 -- # export SPDK_RUN_UBSAN 00:08:35.680 20:55:51 -- common/autotest_common.sh@123 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:35.680 20:55:51 -- common/autotest_common.sh@124 -- # export SPDK_RUN_EXTERNAL_DPDK 00:08:35.680 20:55:51 -- common/autotest_common.sh@125 -- # : 0 00:08:35.680 20:55:51 -- common/autotest_common.sh@126 -- # export SPDK_RUN_NON_ROOT 00:08:35.680 20:55:51 -- common/autotest_common.sh@127 -- # : 0 00:08:35.680 20:55:51 -- common/autotest_common.sh@128 -- # export SPDK_TEST_CRYPTO 00:08:35.680 20:55:51 -- common/autotest_common.sh@129 -- # : 0 00:08:35.680 20:55:51 -- common/autotest_common.sh@130 -- # export SPDK_TEST_FTL 00:08:35.680 20:55:51 -- common/autotest_common.sh@131 -- # : 0 00:08:35.680 20:55:51 -- common/autotest_common.sh@132 -- # export SPDK_TEST_OCF 00:08:35.680 20:55:51 -- common/autotest_common.sh@133 -- # : 0 00:08:35.680 20:55:51 -- common/autotest_common.sh@134 -- # export SPDK_TEST_VMD 00:08:35.680 20:55:51 -- common/autotest_common.sh@135 -- # : 0 00:08:35.680 20:55:51 -- common/autotest_common.sh@136 -- # export SPDK_TEST_OPAL 00:08:35.680 20:55:51 -- common/autotest_common.sh@137 -- # : main 00:08:35.680 20:55:51 -- common/autotest_common.sh@138 -- # export SPDK_TEST_NATIVE_DPDK 00:08:35.680 20:55:51 -- common/autotest_common.sh@139 -- # : true 00:08:35.680 20:55:51 -- common/autotest_common.sh@140 -- # export SPDK_AUTOTEST_X 00:08:35.680 20:55:51 -- common/autotest_common.sh@141 -- # : 0 00:08:35.680 20:55:51 -- common/autotest_common.sh@142 -- # export SPDK_TEST_RAID5 00:08:35.680 20:55:51 -- common/autotest_common.sh@143 -- # : 0 00:08:35.681 20:55:51 -- common/autotest_common.sh@144 -- # export SPDK_TEST_URING 00:08:35.681 20:55:51 -- common/autotest_common.sh@145 -- # : 0 00:08:35.681 20:55:51 -- common/autotest_common.sh@146 -- # export SPDK_TEST_USDT 00:08:35.681 
20:55:51 -- common/autotest_common.sh@147 -- # : 0 00:08:35.681 20:55:51 -- common/autotest_common.sh@148 -- # export SPDK_TEST_USE_IGB_UIO 00:08:35.681 20:55:51 -- common/autotest_common.sh@149 -- # : 0 00:08:35.681 20:55:51 -- common/autotest_common.sh@150 -- # export SPDK_TEST_SCHEDULER 00:08:35.681 20:55:51 -- common/autotest_common.sh@151 -- # : 0 00:08:35.681 20:55:51 -- common/autotest_common.sh@152 -- # export SPDK_TEST_SCANBUILD 00:08:35.681 20:55:51 -- common/autotest_common.sh@153 -- # : 00:08:35.681 20:55:51 -- common/autotest_common.sh@154 -- # export SPDK_TEST_NVMF_NICS 00:08:35.681 20:55:51 -- common/autotest_common.sh@155 -- # : 0 00:08:35.681 20:55:51 -- common/autotest_common.sh@156 -- # export SPDK_TEST_SMA 00:08:35.681 20:55:51 -- common/autotest_common.sh@157 -- # : 0 00:08:35.681 20:55:51 -- common/autotest_common.sh@158 -- # export SPDK_TEST_DAOS 00:08:35.681 20:55:51 -- common/autotest_common.sh@159 -- # : 0 00:08:35.681 20:55:51 -- common/autotest_common.sh@160 -- # export SPDK_TEST_XNVME 00:08:35.681 20:55:51 -- common/autotest_common.sh@161 -- # : 0 00:08:35.681 20:55:51 -- common/autotest_common.sh@162 -- # export SPDK_TEST_ACCEL_DSA 00:08:35.681 20:55:51 -- common/autotest_common.sh@163 -- # : 0 00:08:35.681 20:55:51 -- common/autotest_common.sh@164 -- # export SPDK_TEST_ACCEL_IAA 00:08:35.681 20:55:51 -- common/autotest_common.sh@166 -- # : 00:08:35.681 20:55:51 -- common/autotest_common.sh@167 -- # export SPDK_TEST_FUZZER_TARGET 00:08:35.681 20:55:51 -- common/autotest_common.sh@168 -- # : 0 00:08:35.681 20:55:51 -- common/autotest_common.sh@169 -- # export SPDK_TEST_NVMF_MDNS 00:08:35.681 20:55:51 -- common/autotest_common.sh@170 -- # : 0 00:08:35.681 20:55:51 -- common/autotest_common.sh@171 -- # export SPDK_JSONRPC_GO_CLIENT 00:08:35.681 20:55:51 -- common/autotest_common.sh@174 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:35.681 20:55:51 -- common/autotest_common.sh@174 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:35.681 20:55:51 -- common/autotest_common.sh@175 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:35.681 20:55:51 -- common/autotest_common.sh@175 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:35.681 20:55:51 -- common/autotest_common.sh@176 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:35.681 20:55:51 -- common/autotest_common.sh@176 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:35.681 20:55:51 -- common/autotest_common.sh@177 -- # export 
LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:35.681 20:55:51 -- common/autotest_common.sh@177 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:35.681 20:55:51 -- common/autotest_common.sh@180 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:08:35.681 20:55:51 -- common/autotest_common.sh@180 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:08:35.681 20:55:51 -- common/autotest_common.sh@184 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:35.681 20:55:51 -- common/autotest_common.sh@184 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:35.681 20:55:51 -- common/autotest_common.sh@188 -- # export PYTHONDONTWRITEBYTECODE=1 00:08:35.681 20:55:51 -- common/autotest_common.sh@188 -- 
# PYTHONDONTWRITEBYTECODE=1 00:08:35.681 20:55:51 -- common/autotest_common.sh@192 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:35.681 20:55:51 -- common/autotest_common.sh@192 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:35.681 20:55:51 -- common/autotest_common.sh@193 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:35.681 20:55:51 -- common/autotest_common.sh@193 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:35.681 20:55:51 -- common/autotest_common.sh@197 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:08:35.681 20:55:51 -- common/autotest_common.sh@198 -- # rm -rf /var/tmp/asan_suppression_file 00:08:35.681 20:55:51 -- common/autotest_common.sh@199 -- # cat 00:08:35.681 20:55:51 -- common/autotest_common.sh@225 -- # echo leak:libfuse3.so 00:08:35.681 20:55:51 -- common/autotest_common.sh@227 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:35.681 20:55:51 -- common/autotest_common.sh@227 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:35.681 20:55:51 -- common/autotest_common.sh@229 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:35.681 20:55:51 -- common/autotest_common.sh@229 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:35.681 20:55:51 -- common/autotest_common.sh@231 -- # '[' -z /var/spdk/dependencies ']' 00:08:35.681 20:55:51 -- common/autotest_common.sh@234 -- # export DEPENDENCY_DIR 00:08:35.681 20:55:51 -- common/autotest_common.sh@238 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:35.681 20:55:51 -- common/autotest_common.sh@238 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:35.681 20:55:51 -- common/autotest_common.sh@239 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:35.681 20:55:51 -- common/autotest_common.sh@239 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:35.681 20:55:51 -- common/autotest_common.sh@242 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:35.681 20:55:51 -- common/autotest_common.sh@242 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:35.681 20:55:51 -- common/autotest_common.sh@243 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:35.681 20:55:51 -- common/autotest_common.sh@243 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:35.681 20:55:51 -- common/autotest_common.sh@245 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:35.681 20:55:51 -- common/autotest_common.sh@245 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:35.681 20:55:51 -- common/autotest_common.sh@248 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:35.681 20:55:51 -- common/autotest_common.sh@248 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:35.681 20:55:51 -- common/autotest_common.sh@251 -- # '[' 0 -eq 0 ']' 00:08:35.681 20:55:51 -- common/autotest_common.sh@252 -- # export valgrind= 00:08:35.681 20:55:51 -- common/autotest_common.sh@252 -- # valgrind= 00:08:35.681 20:55:51 -- common/autotest_common.sh@258 -- # uname -s 00:08:35.681 20:55:51 -- 
common/autotest_common.sh@258 -- # '[' Linux = Linux ']' 00:08:35.681 20:55:51 -- common/autotest_common.sh@259 -- # HUGEMEM=4096 00:08:35.681 20:55:51 -- common/autotest_common.sh@260 -- # export CLEAR_HUGE=yes 00:08:35.681 20:55:51 -- common/autotest_common.sh@260 -- # CLEAR_HUGE=yes 00:08:35.681 20:55:51 -- common/autotest_common.sh@261 -- # [[ 0 -eq 1 ]] 00:08:35.681 20:55:51 -- common/autotest_common.sh@261 -- # [[ 0 -eq 1 ]] 00:08:35.681 20:55:51 -- common/autotest_common.sh@268 -- # MAKE=make 00:08:35.681 20:55:51 -- common/autotest_common.sh@269 -- # MAKEFLAGS=-j112 00:08:35.681 20:55:51 -- common/autotest_common.sh@285 -- # export HUGEMEM=4096 00:08:35.681 20:55:51 -- common/autotest_common.sh@285 -- # HUGEMEM=4096 00:08:35.681 20:55:51 -- common/autotest_common.sh@287 -- # NO_HUGE=() 00:08:35.681 20:55:51 -- common/autotest_common.sh@288 -- # TEST_MODE= 00:08:35.681 20:55:51 -- common/autotest_common.sh@307 -- # [[ -z 203281 ]] 00:08:35.681 20:55:51 -- common/autotest_common.sh@307 -- # kill -0 203281 00:08:35.681 20:55:51 -- common/autotest_common.sh@1666 -- # set_test_storage 2147483648 00:08:35.681 20:55:51 -- common/autotest_common.sh@317 -- # [[ -v testdir ]] 00:08:35.681 20:55:51 -- common/autotest_common.sh@319 -- # local requested_size=2147483648 00:08:35.681 20:55:51 -- common/autotest_common.sh@320 -- # local mount target_dir 00:08:35.681 20:55:51 -- common/autotest_common.sh@322 -- # local -A mounts fss sizes avails uses 00:08:35.681 20:55:51 -- common/autotest_common.sh@323 -- # local source fs size avail mount use 00:08:35.681 20:55:51 -- common/autotest_common.sh@325 -- # local storage_fallback storage_candidates 00:08:35.681 20:55:51 -- common/autotest_common.sh@327 -- # mktemp -udt spdk.XXXXXX 00:08:35.941 20:55:51 -- common/autotest_common.sh@327 -- # storage_fallback=/tmp/spdk.Ssn9JO 00:08:35.941 20:55:51 -- common/autotest_common.sh@332 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:08:35.941 20:55:51 -- common/autotest_common.sh@334 -- # [[ -n '' ]] 00:08:35.941 20:55:51 -- common/autotest_common.sh@339 -- # [[ -n '' ]] 00:08:35.941 20:55:51 -- common/autotest_common.sh@344 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio /tmp/spdk.Ssn9JO/tests/vfio /tmp/spdk.Ssn9JO 00:08:35.941 20:55:51 -- common/autotest_common.sh@347 -- # requested_size=2214592512 00:08:35.941 20:55:51 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:08:35.941 20:55:51 -- common/autotest_common.sh@316 -- # df -T 00:08:35.941 20:55:51 -- common/autotest_common.sh@316 -- # grep -v Filesystem 00:08:35.941 20:55:51 -- common/autotest_common.sh@350 -- # mounts["$mount"]=spdk_devtmpfs 00:08:35.941 20:55:51 -- common/autotest_common.sh@350 -- # fss["$mount"]=devtmpfs 00:08:35.941 20:55:51 -- common/autotest_common.sh@351 -- # avails["$mount"]=67108864 00:08:35.941 20:55:51 -- common/autotest_common.sh@351 -- # sizes["$mount"]=67108864 00:08:35.941 20:55:51 -- common/autotest_common.sh@352 -- # uses["$mount"]=0 00:08:35.941 20:55:51 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:08:35.941 20:55:51 -- common/autotest_common.sh@350 -- # mounts["$mount"]=/dev/pmem0 00:08:35.941 20:55:51 -- common/autotest_common.sh@350 -- # fss["$mount"]=ext2 00:08:35.941 20:55:51 -- common/autotest_common.sh@351 -- # avails["$mount"]=1052192768 00:08:35.941 20:55:51 -- common/autotest_common.sh@351 -- # sizes["$mount"]=5284429824 00:08:35.941 20:55:51 -- 
common/autotest_common.sh@352 -- # uses["$mount"]=4232237056 00:08:35.941 20:55:51 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:08:35.941 20:55:51 -- common/autotest_common.sh@350 -- # mounts["$mount"]=spdk_root 00:08:35.941 20:55:51 -- common/autotest_common.sh@350 -- # fss["$mount"]=overlay 00:08:35.941 20:55:51 -- common/autotest_common.sh@351 -- # avails["$mount"]=52735700992 00:08:35.941 20:55:51 -- common/autotest_common.sh@351 -- # sizes["$mount"]=61742305280 00:08:35.941 20:55:51 -- common/autotest_common.sh@352 -- # uses["$mount"]=9006604288 00:08:35.941 20:55:51 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:08:35.941 20:55:51 -- common/autotest_common.sh@350 -- # mounts["$mount"]=tmpfs 00:08:35.941 20:55:51 -- common/autotest_common.sh@350 -- # fss["$mount"]=tmpfs 00:08:35.941 20:55:51 -- common/autotest_common.sh@351 -- # avails["$mount"]=30868537344 00:08:35.941 20:55:51 -- common/autotest_common.sh@351 -- # sizes["$mount"]=30871150592 00:08:35.941 20:55:51 -- common/autotest_common.sh@352 -- # uses["$mount"]=2613248 00:08:35.941 20:55:51 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:08:35.941 20:55:51 -- common/autotest_common.sh@350 -- # mounts["$mount"]=tmpfs 00:08:35.941 20:55:51 -- common/autotest_common.sh@350 -- # fss["$mount"]=tmpfs 00:08:35.941 20:55:51 -- common/autotest_common.sh@351 -- # avails["$mount"]=12342480896 00:08:35.941 20:55:51 -- common/autotest_common.sh@351 -- # sizes["$mount"]=12348461056 00:08:35.941 20:55:51 -- common/autotest_common.sh@352 -- # uses["$mount"]=5980160 00:08:35.941 20:55:51 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:08:35.941 20:55:51 -- common/autotest_common.sh@350 -- # mounts["$mount"]=tmpfs 00:08:35.941 20:55:51 -- common/autotest_common.sh@350 -- # fss["$mount"]=tmpfs 00:08:35.941 20:55:51 -- common/autotest_common.sh@351 -- # avails["$mount"]=30870736896 00:08:35.941 20:55:51 -- common/autotest_common.sh@351 -- # sizes["$mount"]=30871154688 00:08:35.941 20:55:51 -- common/autotest_common.sh@352 -- # uses["$mount"]=417792 00:08:35.941 20:55:51 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:08:35.942 20:55:51 -- common/autotest_common.sh@350 -- # mounts["$mount"]=tmpfs 00:08:35.942 20:55:51 -- common/autotest_common.sh@350 -- # fss["$mount"]=tmpfs 00:08:35.942 20:55:51 -- common/autotest_common.sh@351 -- # avails["$mount"]=6174224384 00:08:35.942 20:55:51 -- common/autotest_common.sh@351 -- # sizes["$mount"]=6174228480 00:08:35.942 20:55:51 -- common/autotest_common.sh@352 -- # uses["$mount"]=4096 00:08:35.942 20:55:51 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:08:35.942 20:55:51 -- common/autotest_common.sh@355 -- # printf '* Looking for test storage...\n' 00:08:35.942 * Looking for test storage... 
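The df -T rows cached above are what set_test_storage consumes; the lines that follow resolve the test directory to its mount point and check it has the requested 2 GiB free. A minimal bash sketch of that probe, mirroring the trace (the *1024 conversion is an inference from the stored byte values; $target_dir is a placeholder for the storage candidate; error handling omitted):

    # sketch only: $target_dir stands for the candidate directory being probed
    requested_size=2147483648                # 2 GiB, as requested in the trace
    declare -A mounts fss sizes avails uses
    while read -r source fs size use avail _ mount; do
        mounts["$mount"]=$source
        fss["$mount"]=$fs
        sizes["$mount"]=$((size * 1024))     # df -T prints 1K blocks; the byte
        uses["$mount"]=$((use * 1024))       # values stored above imply this
        avails["$mount"]=$((avail * 1024))   # *1024 step (an inference)
    done < <(df -T | grep -v Filesystem)
    # resolve the candidate dir to its mount point, then compare free space
    mount=$(df "$target_dir" | awk '$1 !~ /Filesystem/{print $6}')
    target_space=${avails[$mount]}
    (( target_space >= requested_size )) && printf '* Found test storage at %s\n' "$target_dir"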
00:08:35.942 20:55:51 -- common/autotest_common.sh@357 -- # local target_space new_size 00:08:35.942 20:55:51 -- common/autotest_common.sh@358 -- # for target_dir in "${storage_candidates[@]}" 00:08:35.942 20:55:51 -- common/autotest_common.sh@361 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:35.942 20:55:51 -- common/autotest_common.sh@361 -- # awk '$1 !~ /Filesystem/{print $6}' 00:08:35.942 20:55:51 -- common/autotest_common.sh@361 -- # mount=/ 00:08:35.942 20:55:51 -- common/autotest_common.sh@363 -- # target_space=52735700992 00:08:35.942 20:55:51 -- common/autotest_common.sh@364 -- # (( target_space == 0 || target_space < requested_size )) 00:08:35.942 20:55:51 -- common/autotest_common.sh@367 -- # (( target_space >= requested_size )) 00:08:35.942 20:55:51 -- common/autotest_common.sh@369 -- # [[ overlay == tmpfs ]] 00:08:35.942 20:55:51 -- common/autotest_common.sh@369 -- # [[ overlay == ramfs ]] 00:08:35.942 20:55:51 -- common/autotest_common.sh@369 -- # [[ / == / ]] 00:08:35.942 20:55:51 -- common/autotest_common.sh@370 -- # new_size=11221196800 00:08:35.942 20:55:51 -- common/autotest_common.sh@371 -- # (( new_size * 100 / sizes[/] > 95 )) 00:08:35.942 20:55:51 -- common/autotest_common.sh@376 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:35.942 20:55:51 -- common/autotest_common.sh@376 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:35.942 20:55:51 -- common/autotest_common.sh@377 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:35.942 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:35.942 20:55:51 -- common/autotest_common.sh@378 -- # return 0 00:08:35.942 20:55:51 -- common/autotest_common.sh@1668 -- # set -o errtrace 00:08:35.942 20:55:51 -- common/autotest_common.sh@1669 -- # shopt -s extdebug 00:08:35.942 20:55:51 -- common/autotest_common.sh@1670 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:08:35.942 20:55:51 -- common/autotest_common.sh@1672 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:08:35.942 20:55:51 -- common/autotest_common.sh@1673 -- # true 00:08:35.942 20:55:51 -- common/autotest_common.sh@1675 -- # xtrace_fd 00:08:35.942 20:55:51 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:08:35.942 20:55:51 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:08:35.942 20:55:51 -- common/autotest_common.sh@27 -- # exec 00:08:35.942 20:55:51 -- common/autotest_common.sh@29 -- # exec 00:08:35.942 20:55:51 -- common/autotest_common.sh@31 -- # xtrace_restore 00:08:35.942 20:55:51 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:08:35.942 20:55:51 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:08:35.942 20:55:51 -- common/autotest_common.sh@18 -- # set -x 00:08:35.942 20:55:51 -- vfio/run.sh@65 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/../common.sh 00:08:35.942 20:55:51 -- ../common.sh@8 -- # pids=() 00:08:35.942 20:55:51 -- vfio/run.sh@67 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:35.942 20:55:51 -- vfio/run.sh@68 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:35.942 20:55:51 -- vfio/run.sh@68 -- # fuzz_num=7 00:08:35.942 20:55:51 -- vfio/run.sh@69 -- # (( fuzz_num != 0 )) 00:08:35.942 20:55:51 -- vfio/run.sh@71 -- # trap 'cleanup /tmp/vfio-user-* /var/tmp/suppress_vfio_fuzz; exit 1' SIGINT SIGTERM EXIT 00:08:35.942 20:55:51 -- vfio/run.sh@74 -- # mem_size=0 00:08:35.942 20:55:51 -- vfio/run.sh@75 -- # [[ 1 -eq 1 ]] 00:08:35.942 20:55:51 -- vfio/run.sh@76 -- # start_llvm_fuzz_short 7 1 00:08:35.942 20:55:51 -- ../common.sh@69 -- # local fuzz_num=7 00:08:35.942 20:55:51 -- ../common.sh@70 -- # local time=1 00:08:35.942 20:55:51 -- ../common.sh@72 -- # (( i = 0 )) 00:08:35.942 20:55:51 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:35.942 20:55:51 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:08:35.942 20:55:51 -- vfio/run.sh@22 -- # local fuzzer_type=0 00:08:35.942 20:55:51 -- vfio/run.sh@23 -- # local timen=1 00:08:35.942 20:55:51 -- vfio/run.sh@24 -- # local core=0x1 00:08:35.942 20:55:51 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:35.942 20:55:51 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-0 00:08:35.942 20:55:51 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-0/domain/1 00:08:35.942 20:55:51 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-0/domain/2 00:08:35.942 20:55:51 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-0/fuzz_vfio_json.conf 00:08:35.942 20:55:51 -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:35.942 20:55:51 -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:35.942 20:55:51 -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-0 /tmp/vfio-user-0/domain/1 /tmp/vfio-user-0/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:35.942 20:55:51 -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-0/domain/1%; 00:08:35.942 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-0/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:35.942 20:55:51 -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:35.942 20:55:51 -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:35.942 20:55:51 -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-0/domain/1 -c /tmp/vfio-user-0/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 -Y /tmp/vfio-user-0/domain/2 -r /tmp/vfio-user-0/spdk0.sock -Z 0 00:08:35.942 [2024-04-25 20:55:51.432129] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 
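The long llvm_vfio_fuzz command line just launched is repeated once per fuzzer type below, each wired to its own /tmp/vfio-user-N socket tree. The flag glosses here are inferred from the local variables in run.sh's trace (core, mem_size, timen, corpus_dir, vfiouser_dir, ...), not from SPDK's option documentation, so treat them as assumptions:

    # Hedged, flag-by-flag reading of the invocation above:
    #   -m 0x1      core mask             (local core=0x1)
    #   -s 0        memory size in MB     (local mem_size=0)
    #   -P .../output/llvm/               crash/artifact output prefix (assumption)
    #   -F /tmp/vfio-user-0/domain/1      vfio-user controller dir (vfiouser_dir)
    #   -c .../fuzz_vfio_json.conf        config produced by the sed rewrite above
    #   -t 1        run time in seconds   (local timen=1)
    #   -D .../llvm_vfio_0                persistent corpus dir (corpus_dir)
    #   -Y /tmp/vfio-user-0/domain/2      vfio-user io dir (vfiouser_io_dir)
    #   -r /tmp/vfio-user-0/spdk0.sock    per-run RPC socket
    #   -Z 0        fuzzer type index, 0..6 (fuzz_num=7 via grep -c '\.fn =' above)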
00:08:35.942 [2024-04-25 20:55:51.432197] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid203443 ] 00:08:35.942 EAL: No free 2048 kB hugepages reported on node 1 00:08:35.942 [2024-04-25 20:55:51.466396] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:08:35.942 [2024-04-25 20:55:51.501554] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:35.942 [2024-04-25 20:55:51.537350] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:36.201 INFO: Running with entropic power schedule (0xFF, 100). 00:08:36.201 INFO: Seed: 377367558 00:08:36.201 INFO: Loaded 1 modules (345815 inline 8-bit counters): 345815 [0x27253cc, 0x2779aa3), 00:08:36.201 INFO: Loaded 1 PC tables (345815 PCs): 345815 [0x2779aa8,0x2cc0818), 00:08:36.201 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:36.201 INFO: A corpus is not provided, starting from an empty corpus 00:08:36.201 #2 INITED exec/s: 0 rss: 62Mb 00:08:36.201 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:36.201 This may also happen if the target rejected all inputs we tried so far 00:08:36.201 [2024-04-25 20:55:51.768022] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-0/domain/2: enabling controller 00:08:36.718 NEW_FUNC[1/635]: 0x4a3680 in fuzz_vfio_user_region_rw /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:84 00:08:36.719 NEW_FUNC[2/635]: 0x4a9190 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:36.719 #20 NEW cov: 10824 ft: 10671 corp: 2/7b lim: 6 exec/s: 0 rss: 68Mb L: 6/6 MS: 3 ChangeByte-ShuffleBytes-InsertRepeatedBytes- 00:08:36.978 #21 NEW cov: 10847 ft: 13653 corp: 3/13b lim: 6 exec/s: 0 rss: 69Mb L: 6/6 MS: 1 ChangeASCIIInt- 00:08:36.978 NEW_FUNC[1/1]: 0x19b3060 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:36.978 #23 NEW cov: 10871 ft: 15368 corp: 4/19b lim: 6 exec/s: 0 rss: 70Mb L: 6/6 MS: 2 CopyPart-CrossOver- 00:08:37.236 #24 NEW cov: 10871 ft: 16585 corp: 5/25b lim: 6 exec/s: 24 rss: 70Mb L: 6/6 MS: 1 ChangeBit- 00:08:37.495 #25 NEW cov: 10871 ft: 16732 corp: 6/31b lim: 6 exec/s: 25 rss: 70Mb L: 6/6 MS: 1 ChangeBinInt- 00:08:37.753 #26 NEW cov: 10873 ft: 16777 corp: 7/37b lim: 6 exec/s: 26 rss: 70Mb L: 6/6 MS: 1 CrossOver- 00:08:37.753 #27 NEW cov: 10873 ft: 17226 corp: 8/43b lim: 6 exec/s: 27 rss: 70Mb L: 6/6 MS: 1 ShuffleBytes- 00:08:38.012 #28 NEW cov: 10880 ft: 17488 corp: 9/49b lim: 6 exec/s: 28 rss: 70Mb L: 6/6 MS: 1 ChangeByte- 00:08:38.271 #41 NEW cov: 10880 ft: 17676 corp: 10/55b lim: 6 exec/s: 20 rss: 70Mb L: 6/6 MS: 3 EraseBytes-CrossOver-CopyPart- 00:08:38.271 #41 DONE cov: 10880 ft: 17676 corp: 10/55b lim: 6 exec/s: 20 rss: 70Mb 00:08:38.271 Done 41 runs in 2 second(s) 00:08:38.271 [2024-04-25 20:55:53.789179] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-0/domain/2: disabling controller 00:08:38.530 20:55:54 -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-0 /var/tmp/suppress_vfio_fuzz 00:08:38.530 20:55:54 -- ../common.sh@72 -- # (( i++ )) 00:08:38.530 20:55:54 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:38.530 20:55:54 -- 
../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:08:38.530 20:55:54 -- vfio/run.sh@22 -- # local fuzzer_type=1 00:08:38.530 20:55:54 -- vfio/run.sh@23 -- # local timen=1 00:08:38.530 20:55:54 -- vfio/run.sh@24 -- # local core=0x1 00:08:38.530 20:55:54 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:38.530 20:55:54 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-1 00:08:38.530 20:55:54 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-1/domain/1 00:08:38.530 20:55:54 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-1/domain/2 00:08:38.530 20:55:54 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-1/fuzz_vfio_json.conf 00:08:38.530 20:55:54 -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:38.530 20:55:54 -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:38.530 20:55:54 -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-1 /tmp/vfio-user-1/domain/1 /tmp/vfio-user-1/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:38.530 20:55:54 -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-1/domain/1%; 00:08:38.530 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-1/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:38.530 20:55:54 -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:38.530 20:55:54 -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:38.530 20:55:54 -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-1/domain/1 -c /tmp/vfio-user-1/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 -Y /tmp/vfio-user-1/domain/2 -r /tmp/vfio-user-1/spdk1.sock -Z 1 00:08:38.530 [2024-04-25 20:55:54.066651] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:08:38.530 [2024-04-25 20:55:54.066747] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid203766 ] 00:08:38.530 EAL: No free 2048 kB hugepages reported on node 1 00:08:38.530 [2024-04-25 20:55:54.104120] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:08:38.530 [2024-04-25 20:55:54.141129] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:38.530 [2024-04-25 20:55:54.178316] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:38.789 INFO: Running with entropic power schedule (0xFF, 100). 00:08:38.789 INFO: Seed: 3030380030 00:08:38.789 INFO: Loaded 1 modules (345815 inline 8-bit counters): 345815 [0x27253cc, 0x2779aa3), 00:08:38.789 INFO: Loaded 1 PC tables (345815 PCs): 345815 [0x2779aa8,0x2cc0818), 00:08:38.789 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:38.789 INFO: A corpus is not provided, starting from an empty corpus 00:08:38.789 #2 INITED exec/s: 0 rss: 63Mb 00:08:38.789 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:38.789 This may also happen if the target rejected all inputs we tried so far 00:08:38.789 [2024-04-25 20:55:54.423455] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-1/domain/2: enabling controller 00:08:39.048 [2024-04-25 20:55:54.507902] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:39.048 [2024-04-25 20:55:54.507928] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:39.048 [2024-04-25 20:55:54.507947] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:39.306 NEW_FUNC[1/636]: 0x4a3c20 in fuzz_vfio_user_version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:71 00:08:39.306 NEW_FUNC[2/636]: 0x4a9190 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:39.306 #17 NEW cov: 10812 ft: 10795 corp: 2/5b lim: 4 exec/s: 0 rss: 68Mb L: 4/4 MS: 5 ChangeBit-ShuffleBytes-CopyPart-CrossOver-CrossOver- 00:08:39.564 [2024-04-25 20:55:55.016123] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:39.564 [2024-04-25 20:55:55.016154] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:39.564 [2024-04-25 20:55:55.016173] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:39.564 NEW_FUNC[1/1]: 0x1375b90 in nvmf_vfio_user_poll_group_poll /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:5751 00:08:39.564 #18 NEW cov: 10846 ft: 14124 corp: 3/9b lim: 4 exec/s: 0 rss: 69Mb L: 4/4 MS: 1 ChangeByte- 00:08:39.822 [2024-04-25 20:55:55.227512] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:39.822 [2024-04-25 20:55:55.227536] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:39.822 [2024-04-25 20:55:55.227554] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:39.822 NEW_FUNC[1/1]: 0x19b3060 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:39.822 #24 NEW cov: 10863 ft: 15059 corp: 4/13b lim: 4 exec/s: 0 rss: 70Mb L: 4/4 MS: 1 CrossOver- 00:08:39.822 [2024-04-25 20:55:55.425494] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:39.822 [2024-04-25 20:55:55.425525] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:39.822 [2024-04-25 20:55:55.425542] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:40.080 #25 NEW cov: 10863 ft: 16134 corp: 5/17b lim: 4 exec/s: 25 rss: 70Mb L: 4/4 MS: 1 ShuffleBytes- 00:08:40.080 [2024-04-25 20:55:55.627938] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:40.080 [2024-04-25 20:55:55.627961] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:40.080 [2024-04-25 20:55:55.627978] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:40.338 #31 NEW cov: 10863 ft: 16479 corp: 6/21b lim: 4 exec/s: 31 rss: 70Mb L: 4/4 MS: 1 CopyPart- 00:08:40.338 [2024-04-25 20:55:55.833712] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:40.338 [2024-04-25 20:55:55.833733] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid 
argument 00:08:40.338 [2024-04-25 20:55:55.833750] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:40.338 #32 NEW cov: 10863 ft: 16914 corp: 7/25b lim: 4 exec/s: 32 rss: 70Mb L: 4/4 MS: 1 ChangeBinInt- 00:08:40.596 [2024-04-25 20:55:56.033098] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:40.596 [2024-04-25 20:55:56.033120] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:40.596 [2024-04-25 20:55:56.033138] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:40.596 #33 NEW cov: 10863 ft: 17112 corp: 8/29b lim: 4 exec/s: 33 rss: 70Mb L: 4/4 MS: 1 CMP- DE: "\201\000\000\000"- 00:08:40.596 [2024-04-25 20:55:56.238777] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:40.596 [2024-04-25 20:55:56.238799] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:40.596 [2024-04-25 20:55:56.238816] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:40.854 #34 NEW cov: 10870 ft: 17247 corp: 9/33b lim: 4 exec/s: 34 rss: 70Mb L: 4/4 MS: 1 CrossOver- 00:08:40.854 [2024-04-25 20:55:56.436944] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:40.854 [2024-04-25 20:55:56.436966] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:40.854 [2024-04-25 20:55:56.436985] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:41.112 #40 NEW cov: 10870 ft: 17677 corp: 10/37b lim: 4 exec/s: 20 rss: 70Mb L: 4/4 MS: 1 CopyPart- 00:08:41.112 #40 DONE cov: 10870 ft: 17677 corp: 10/37b lim: 4 exec/s: 20 rss: 70Mb 00:08:41.112 ###### Recommended dictionary. ###### 00:08:41.112 "\201\000\000\000" # Uses: 0 00:08:41.112 ###### End of recommended dictionary. 
###### 00:08:41.112 Done 40 runs in 2 second(s) 00:08:41.112 [2024-04-25 20:55:56.576181] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-1/domain/2: disabling controller 00:08:41.371 20:55:56 -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-1 /var/tmp/suppress_vfio_fuzz 00:08:41.371 20:55:56 -- ../common.sh@72 -- # (( i++ )) 00:08:41.371 20:55:56 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:41.371 20:55:56 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:08:41.371 20:55:56 -- vfio/run.sh@22 -- # local fuzzer_type=2 00:08:41.371 20:55:56 -- vfio/run.sh@23 -- # local timen=1 00:08:41.371 20:55:56 -- vfio/run.sh@24 -- # local core=0x1 00:08:41.371 20:55:56 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:41.371 20:55:56 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-2 00:08:41.371 20:55:56 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-2/domain/1 00:08:41.371 20:55:56 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-2/domain/2 00:08:41.371 20:55:56 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-2/fuzz_vfio_json.conf 00:08:41.371 20:55:56 -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:41.371 20:55:56 -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:41.371 20:55:56 -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-2 /tmp/vfio-user-2/domain/1 /tmp/vfio-user-2/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:41.371 20:55:56 -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-2/domain/1%; 00:08:41.371 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-2/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:41.372 20:55:56 -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:41.372 20:55:56 -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:41.372 20:55:56 -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-2/domain/1 -c /tmp/vfio-user-2/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 -Y /tmp/vfio-user-2/domain/2 -r /tmp/vfio-user-2/spdk2.sock -Z 2 00:08:41.372 [2024-04-25 20:55:56.848086] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:08:41.372 [2024-04-25 20:55:56.848158] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid204304 ] 00:08:41.372 EAL: No free 2048 kB hugepages reported on node 1 00:08:41.372 [2024-04-25 20:55:56.883066] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:08:41.372 [2024-04-25 20:55:56.918939] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:41.372 [2024-04-25 20:55:56.954885] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:41.631 INFO: Running with entropic power schedule (0xFF, 100). 
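The numbered status lines in each run ("#N NEW cov: ...") are standard libFuzzer output rather than anything SPDK-specific. An annotated example, using a line from the fuzzer-0 run above (glosses are the usual libFuzzer meanings):

    #25 NEW cov: 10871 ft: 16732 corp: 6/31b lim: 6 exec/s: 25 rss: 70Mb L: 6/6 MS: 1 ChangeBinInt-
    # #25          input number at which the event occurred
    # NEW          the input increased coverage and was added to the corpus
    # cov:         total covered control-flow edges
    # ft:          coverage features (a finer-grained signal than cov)
    # corp: 6/31b  corpus now holds 6 inputs totalling 31 bytes
    # lim:         current maximum input length being explored
    # exec/s:      executions per second
    # rss:         resident memory of the fuzzer process
    # L: 6/6       this input's length / max input length (approximate gloss)
    # MS: 1 ChangeBinInt-  mutation sequence that produced the input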
00:08:41.631 INFO: Seed: 1499397688 00:08:41.631 INFO: Loaded 1 modules (345815 inline 8-bit counters): 345815 [0x27253cc, 0x2779aa3), 00:08:41.631 INFO: Loaded 1 PC tables (345815 PCs): 345815 [0x2779aa8,0x2cc0818), 00:08:41.631 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:41.631 INFO: A corpus is not provided, starting from an empty corpus 00:08:41.631 #2 INITED exec/s: 0 rss: 62Mb 00:08:41.631 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:41.631 This may also happen if the target rejected all inputs we tried so far 00:08:41.631 [2024-04-25 20:55:57.185118] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-2/domain/2: enabling controller 00:08:41.631 [2024-04-25 20:55:57.248991] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:42.149 NEW_FUNC[1/636]: 0x4a4600 in fuzz_vfio_user_get_region_info /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:103 00:08:42.149 NEW_FUNC[2/636]: 0x4a9190 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:42.149 #42 NEW cov: 10812 ft: 10566 corp: 2/9b lim: 8 exec/s: 0 rss: 68Mb L: 8/8 MS: 5 InsertByte-CrossOver-CopyPart-ShuffleBytes-InsertRepeatedBytes- 00:08:42.149 [2024-04-25 20:55:57.713966] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:42.407 #52 NEW cov: 10826 ft: 13628 corp: 3/17b lim: 8 exec/s: 0 rss: 69Mb L: 8/8 MS: 5 InsertRepeatedBytes-ChangeByte-ChangeByte-ChangeASCIIInt-CopyPart- 00:08:42.407 [2024-04-25 20:55:57.914077] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:42.407 NEW_FUNC[1/1]: 0x19b3060 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:42.407 #53 NEW cov: 10846 ft: 14709 corp: 4/25b lim: 8 exec/s: 0 rss: 70Mb L: 8/8 MS: 1 ChangeBinInt- 00:08:42.666 [2024-04-25 20:55:58.103753] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:42.666 #54 NEW cov: 10846 ft: 15296 corp: 5/33b lim: 8 exec/s: 54 rss: 70Mb L: 8/8 MS: 1 ChangeASCIIInt- 00:08:42.666 [2024-04-25 20:55:58.295215] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:42.924 #55 NEW cov: 10846 ft: 15464 corp: 6/41b lim: 8 exec/s: 55 rss: 71Mb L: 8/8 MS: 1 CrossOver- 00:08:42.924 [2024-04-25 20:55:58.483998] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:43.183 #56 NEW cov: 10846 ft: 15590 corp: 7/49b lim: 8 exec/s: 56 rss: 71Mb L: 8/8 MS: 1 ChangeBit- 00:08:43.183 [2024-04-25 20:55:58.674330] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:43.183 #62 NEW cov: 10846 ft: 16192 corp: 8/57b lim: 8 exec/s: 62 rss: 71Mb L: 8/8 MS: 1 ShuffleBytes- 00:08:43.442 [2024-04-25 20:55:58.865328] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:43.442 #68 NEW cov: 10853 ft: 16264 corp: 9/65b lim: 8 exec/s: 68 rss: 71Mb L: 8/8 MS: 1 CopyPart- 00:08:43.442 [2024-04-25 20:55:59.056980] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:43.701 #74 NEW cov: 10853 ft: 16540 corp: 10/73b lim: 8 exec/s: 37 rss: 71Mb L: 8/8 MS: 1 ChangeBit- 00:08:43.701 #74 DONE cov: 10853 ft: 16540 corp: 10/73b lim: 8 
exec/s: 37 rss: 71Mb 00:08:43.701 Done 74 runs in 2 second(s) 00:08:43.701 [2024-04-25 20:55:59.192189] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-2/domain/2: disabling controller 00:08:43.961 20:55:59 -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-2 /var/tmp/suppress_vfio_fuzz 00:08:43.961 20:55:59 -- ../common.sh@72 -- # (( i++ )) 00:08:43.961 20:55:59 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:43.961 20:55:59 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:08:43.961 20:55:59 -- vfio/run.sh@22 -- # local fuzzer_type=3 00:08:43.961 20:55:59 -- vfio/run.sh@23 -- # local timen=1 00:08:43.961 20:55:59 -- vfio/run.sh@24 -- # local core=0x1 00:08:43.961 20:55:59 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:43.961 20:55:59 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-3 00:08:43.961 20:55:59 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-3/domain/1 00:08:43.961 20:55:59 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-3/domain/2 00:08:43.961 20:55:59 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-3/fuzz_vfio_json.conf 00:08:43.961 20:55:59 -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:43.961 20:55:59 -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:43.961 20:55:59 -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-3 /tmp/vfio-user-3/domain/1 /tmp/vfio-user-3/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:43.961 20:55:59 -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-3/domain/1%; 00:08:43.961 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-3/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:43.961 20:55:59 -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:43.961 20:55:59 -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:43.961 20:55:59 -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-3/domain/1 -c /tmp/vfio-user-3/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 -Y /tmp/vfio-user-3/domain/2 -r /tmp/vfio-user-3/spdk3.sock -Z 3 00:08:43.961 [2024-04-25 20:55:59.465951] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:08:43.961 [2024-04-25 20:55:59.466043] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid204839 ] 00:08:43.961 EAL: No free 2048 kB hugepages reported on node 1 00:08:43.961 [2024-04-25 20:55:59.500608] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:08:43.961 [2024-04-25 20:55:59.537439] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:43.961 [2024-04-25 20:55:59.573355] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:44.221 INFO: Running with entropic power schedule (0xFF, 100). 
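Before each run, run.sh@43-44's echo leak:... lines append known-benign leaks to the shared suppression file referenced by LSAN_OPTIONS at run.sh@34 (xtrace hides the redirection, so the file target is presumed). A minimal sketch of the mechanism, assuming the same file path:

    # LeakSanitizer reads 'leak:<pattern>' lines and silences any leak whose
    # stack trace matches a pattern, instead of failing the run.
    supp=/var/tmp/suppress_vfio_fuzz
    echo "leak:spdk_nvmf_qpair_disconnect" >> "$supp"
    echo "leak:nvmf_ctrlr_create" >> "$supp"
    export LSAN_OPTIONS="report_objects=1:suppressions=$supp:print_suppressions=0"
    # print_suppressions=0 keeps the end-of-run summary quiet;
    # report_objects=1 lists leaked addresses for non-suppressed leaks.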
00:08:44.221 INFO: Seed: 4118448541 00:08:44.221 INFO: Loaded 1 modules (345815 inline 8-bit counters): 345815 [0x27253cc, 0x2779aa3), 00:08:44.221 INFO: Loaded 1 PC tables (345815 PCs): 345815 [0x2779aa8,0x2cc0818), 00:08:44.221 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:44.221 INFO: A corpus is not provided, starting from an empty corpus 00:08:44.221 #2 INITED exec/s: 0 rss: 63Mb 00:08:44.221 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:44.221 This may also happen if the target rejected all inputs we tried so far 00:08:44.221 [2024-04-25 20:55:59.803109] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-3/domain/2: enabling controller 00:08:44.738 NEW_FUNC[1/636]: 0x4a4ce0 in fuzz_vfio_user_dma_map /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:124 00:08:44.738 NEW_FUNC[2/636]: 0x4a9190 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:44.738 #262 NEW cov: 10768 ft: 10792 corp: 2/33b lim: 32 exec/s: 0 rss: 68Mb L: 32/32 MS: 5 ChangeBinInt-CrossOver-ChangeBinInt-ChangeByte-InsertRepeatedBytes- 00:08:44.997 #263 NEW cov: 10837 ft: 13501 corp: 3/65b lim: 32 exec/s: 0 rss: 69Mb L: 32/32 MS: 1 ChangeBinInt- 00:08:44.997 NEW_FUNC[1/1]: 0x19b3060 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:44.997 #264 NEW cov: 10854 ft: 14838 corp: 4/97b lim: 32 exec/s: 0 rss: 70Mb L: 32/32 MS: 1 ChangeByte- 00:08:45.256 #265 NEW cov: 10854 ft: 15362 corp: 5/129b lim: 32 exec/s: 265 rss: 70Mb L: 32/32 MS: 1 ChangeBit- 00:08:45.514 #266 NEW cov: 10854 ft: 15613 corp: 6/161b lim: 32 exec/s: 266 rss: 70Mb L: 32/32 MS: 1 CrossOver- 00:08:45.773 #267 NEW cov: 10854 ft: 16090 corp: 7/193b lim: 32 exec/s: 267 rss: 70Mb L: 32/32 MS: 1 CopyPart- 00:08:45.773 #268 NEW cov: 10854 ft: 16209 corp: 8/225b lim: 32 exec/s: 268 rss: 70Mb L: 32/32 MS: 1 CrossOver- 00:08:46.032 #269 NEW cov: 10854 ft: 16494 corp: 9/257b lim: 32 exec/s: 269 rss: 70Mb L: 32/32 MS: 1 ChangeByte- 00:08:46.292 #274 NEW cov: 10861 ft: 16612 corp: 10/289b lim: 32 exec/s: 274 rss: 70Mb L: 32/32 MS: 5 EraseBytes-InsertRepeatedBytes-ChangeBinInt-ChangeByte-InsertByte- 00:08:46.550 #280 NEW cov: 10861 ft: 16655 corp: 11/321b lim: 32 exec/s: 140 rss: 70Mb L: 32/32 MS: 1 ChangeBit- 00:08:46.550 #280 DONE cov: 10861 ft: 16655 corp: 11/321b lim: 32 exec/s: 140 rss: 70Mb 00:08:46.550 Done 280 runs in 2 second(s) 00:08:46.550 [2024-04-25 20:56:01.975200] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-3/domain/2: disabling controller 00:08:46.809 20:56:02 -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-3 /var/tmp/suppress_vfio_fuzz 00:08:46.809 20:56:02 -- ../common.sh@72 -- # (( i++ )) 00:08:46.809 20:56:02 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:46.809 20:56:02 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:08:46.809 20:56:02 -- vfio/run.sh@22 -- # local fuzzer_type=4 00:08:46.809 20:56:02 -- vfio/run.sh@23 -- # local timen=1 00:08:46.809 20:56:02 -- vfio/run.sh@24 -- # local core=0x1 00:08:46.809 20:56:02 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:46.809 20:56:02 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-4 00:08:46.809 20:56:02 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-4/domain/1 00:08:46.809 20:56:02 -- vfio/run.sh@28 -- # local 
vfiouser_io_dir=/tmp/vfio-user-4/domain/2 00:08:46.809 20:56:02 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-4/fuzz_vfio_json.conf 00:08:46.809 20:56:02 -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:46.809 20:56:02 -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:46.809 20:56:02 -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-4 /tmp/vfio-user-4/domain/1 /tmp/vfio-user-4/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:46.809 20:56:02 -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-4/domain/1%; 00:08:46.809 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-4/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:46.809 20:56:02 -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:46.809 20:56:02 -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:46.809 20:56:02 -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-4/domain/1 -c /tmp/vfio-user-4/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 -Y /tmp/vfio-user-4/domain/2 -r /tmp/vfio-user-4/spdk4.sock -Z 4 00:08:46.809 [2024-04-25 20:56:02.256971] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:08:46.809 [2024-04-25 20:56:02.257066] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid205372 ] 00:08:46.809 EAL: No free 2048 kB hugepages reported on node 1 00:08:46.810 [2024-04-25 20:56:02.292907] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:08:46.810 [2024-04-25 20:56:02.328657] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:46.810 [2024-04-25 20:56:02.364330] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:47.068 INFO: Running with entropic power schedule (0xFF, 100). 00:08:47.068 INFO: Seed: 2616433798 00:08:47.068 INFO: Loaded 1 modules (345815 inline 8-bit counters): 345815 [0x27253cc, 0x2779aa3), 00:08:47.068 INFO: Loaded 1 PC tables (345815 PCs): 345815 [0x2779aa8,0x2cc0818), 00:08:47.068 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:47.068 INFO: A corpus is not provided, starting from an empty corpus 00:08:47.068 #2 INITED exec/s: 0 rss: 62Mb 00:08:47.068 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:47.068 This may also happen if the target rejected all inputs we tried so far 00:08:47.068 [2024-04-25 20:56:02.599315] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-4/domain/2: enabling controller 00:08:47.595 NEW_FUNC[1/636]: 0x4a5560 in fuzz_vfio_user_dma_unmap /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:144 00:08:47.595 NEW_FUNC[2/636]: 0x4a9190 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:47.595 #27 NEW cov: 10815 ft: 10785 corp: 2/33b lim: 32 exec/s: 0 rss: 68Mb L: 32/32 MS: 5 CopyPart-ChangeBit-InsertRepeatedBytes-ShuffleBytes-InsertByte- 00:08:47.595 #28 NEW cov: 10835 ft: 14119 corp: 3/65b lim: 32 exec/s: 0 rss: 69Mb L: 32/32 MS: 1 ShuffleBytes- 00:08:47.855 NEW_FUNC[1/1]: 0x19b3060 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:47.855 #29 NEW cov: 10852 ft: 14845 corp: 4/97b lim: 32 exec/s: 0 rss: 70Mb L: 32/32 MS: 1 CMP- DE: "\000\000\000V"- 00:08:47.855 [2024-04-25 20:56:03.516931] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: DMA region size 2025524839466146844 > max 8796093022208 00:08:47.855 [2024-04-25 20:56:03.516968] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0x1c1c1c1c1c1c1c1c, 0x3838383838383838) offset=0x4a1c1c3b1c1c1c1c flags=0x3: No space left on device 00:08:47.855 [2024-04-25 20:56:03.516980] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: No space left on device 00:08:47.855 [2024-04-25 20:56:03.516999] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:47.856 [2024-04-25 20:56:03.517939] vfio_user.c:3104:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0x1c1c1c1c1c1c1c1c, 0x3838383838383838) flags=0: No such file or directory 00:08:47.856 [2024-04-25 20:56:03.517958] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:47.856 [2024-04-25 20:56:03.517978] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:48.114 NEW_FUNC[1/1]: 0x137e840 in vfio_user_log /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:3094 00:08:48.114 #30 NEW cov: 10865 ft: 15253 corp: 5/129b lim: 32 exec/s: 30 rss: 70Mb L: 32/32 MS: 1 PersAutoDict- DE: "\000\000\000V"- 00:08:48.373 #31 NEW cov: 10865 ft: 15853 corp: 6/161b lim: 32 exec/s: 31 rss: 70Mb L: 32/32 MS: 1 ChangeBit- 00:08:48.373 #35 NEW cov: 10865 ft: 15980 corp: 7/193b lim: 32 exec/s: 35 rss: 70Mb L: 32/32 MS: 4 EraseBytes-ShuffleBytes-InsertByte-CopyPart- 00:08:48.631 [2024-04-25 20:56:04.109516] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: DMA region size 2025524839466146844 > max 8796093022208 00:08:48.632 [2024-04-25 20:56:04.109539] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0x560000001c1c1c1c, 0x721c1c1c38383838) offset=0x4a1c1c3b1c1c1c1c flags=0x3: No space left on device 00:08:48.632 [2024-04-25 20:56:04.109551] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: No space left on device 00:08:48.632 [2024-04-25 20:56:04.109582] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:48.632 [2024-04-25 20:56:04.110543] vfio_user.c:3104:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region 
[0x560000001c1c1c1c, 0x721c1c1c38383838) flags=0: No such file or directory 00:08:48.632 [2024-04-25 20:56:04.110562] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:48.632 [2024-04-25 20:56:04.110578] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:48.632 #36 NEW cov: 10865 ft: 16123 corp: 8/225b lim: 32 exec/s: 36 rss: 70Mb L: 32/32 MS: 1 PersAutoDict- DE: "\000\000\000V"- 00:08:48.890 #42 NEW cov: 10872 ft: 16633 corp: 9/257b lim: 32 exec/s: 42 rss: 70Mb L: 32/32 MS: 1 CopyPart- 00:08:49.149 #43 NEW cov: 10872 ft: 16709 corp: 10/289b lim: 32 exec/s: 21 rss: 70Mb L: 32/32 MS: 1 CMP- DE: "\000\000\000\000"- 00:08:49.149 #43 DONE cov: 10872 ft: 16709 corp: 10/289b lim: 32 exec/s: 21 rss: 70Mb 00:08:49.149 ###### Recommended dictionary. ###### 00:08:49.149 "\000\000\000V" # Uses: 2 00:08:49.149 "\000\000\000\000" # Uses: 0 00:08:49.149 ###### End of recommended dictionary. ###### 00:08:49.149 Done 43 runs in 2 second(s) 00:08:49.149 [2024-04-25 20:56:04.646183] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-4/domain/2: disabling controller 00:08:49.408 20:56:04 -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-4 /var/tmp/suppress_vfio_fuzz 00:08:49.408 20:56:04 -- ../common.sh@72 -- # (( i++ )) 00:08:49.408 20:56:04 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:49.408 20:56:04 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:08:49.408 20:56:04 -- vfio/run.sh@22 -- # local fuzzer_type=5 00:08:49.408 20:56:04 -- vfio/run.sh@23 -- # local timen=1 00:08:49.408 20:56:04 -- vfio/run.sh@24 -- # local core=0x1 00:08:49.408 20:56:04 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:49.408 20:56:04 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-5 00:08:49.408 20:56:04 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-5/domain/1 00:08:49.408 20:56:04 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-5/domain/2 00:08:49.408 20:56:04 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-5/fuzz_vfio_json.conf 00:08:49.408 20:56:04 -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:49.408 20:56:04 -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:49.408 20:56:04 -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-5 /tmp/vfio-user-5/domain/1 /tmp/vfio-user-5/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:49.408 20:56:04 -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-5/domain/1%; 00:08:49.408 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-5/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:49.408 20:56:04 -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:49.408 20:56:04 -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:49.408 20:56:04 -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-5/domain/1 -c /tmp/vfio-user-5/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 -Y /tmp/vfio-user-5/domain/2 -r /tmp/vfio-user-5/spdk5.sock -Z 5 00:08:49.408 [2024-04-25 20:56:04.921875] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 
00:08:49.408 [2024-04-25 20:56:04.921956] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid205700 ] 00:08:49.408 EAL: No free 2048 kB hugepages reported on node 1 00:08:49.408 [2024-04-25 20:56:04.956839] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:08:49.408 [2024-04-25 20:56:04.992622] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:49.408 [2024-04-25 20:56:05.028820] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:49.668 INFO: Running with entropic power schedule (0xFF, 100). 00:08:49.668 INFO: Seed: 987473978 00:08:49.668 INFO: Loaded 1 modules (345815 inline 8-bit counters): 345815 [0x27253cc, 0x2779aa3), 00:08:49.668 INFO: Loaded 1 PC tables (345815 PCs): 345815 [0x2779aa8,0x2cc0818), 00:08:49.668 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:49.668 INFO: A corpus is not provided, starting from an empty corpus 00:08:49.668 #2 INITED exec/s: 0 rss: 62Mb 00:08:49.668 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:49.668 This may also happen if the target rejected all inputs we tried so far 00:08:49.668 [2024-04-25 20:56:05.269242] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-5/domain/2: enabling controller 00:08:49.668 [2024-04-25 20:56:05.321027] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:49.668 [2024-04-25 20:56:05.321062] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:50.187 NEW_FUNC[1/637]: 0x4a5f60 in fuzz_vfio_user_irq_set /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:171 00:08:50.187 NEW_FUNC[2/637]: 0x4a9190 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:50.187 #9 NEW cov: 10827 ft: 10789 corp: 2/14b lim: 13 exec/s: 0 rss: 69Mb L: 13/13 MS: 2 InsertByte-InsertRepeatedBytes- 00:08:50.187 [2024-04-25 20:56:05.790364] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:50.187 [2024-04-25 20:56:05.790405] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:50.450 #15 NEW cov: 10844 ft: 13930 corp: 3/27b lim: 13 exec/s: 0 rss: 70Mb L: 13/13 MS: 1 ChangeBinInt- 00:08:50.450 [2024-04-25 20:56:05.971899] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:50.450 [2024-04-25 20:56:05.971929] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:50.450 NEW_FUNC[1/1]: 0x19b3060 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:50.450 #16 NEW cov: 10861 ft: 15075 corp: 4/40b lim: 13 exec/s: 0 rss: 70Mb L: 13/13 MS: 1 CopyPart- 00:08:50.709 [2024-04-25 20:56:06.149742] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:50.709 [2024-04-25 20:56:06.149771] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:50.709 #17 NEW cov: 10861 ft: 15646 corp: 5/53b lim: 13 exec/s: 17 rss: 70Mb L: 13/13 MS: 1 CopyPart- 00:08:50.709 [2024-04-25 20:56:06.330326] 
vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:50.709 [2024-04-25 20:56:06.330356] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:50.968 #18 NEW cov: 10861 ft: 16049 corp: 6/66b lim: 13 exec/s: 18 rss: 70Mb L: 13/13 MS: 1 ChangeBinInt- 00:08:50.968 [2024-04-25 20:56:06.509888] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:50.968 [2024-04-25 20:56:06.509918] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:50.968 #24 NEW cov: 10861 ft: 16262 corp: 7/79b lim: 13 exec/s: 24 rss: 70Mb L: 13/13 MS: 1 ChangeBinInt- 00:08:51.227 [2024-04-25 20:56:06.689066] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:51.227 [2024-04-25 20:56:06.689096] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:51.227 #25 NEW cov: 10861 ft: 16464 corp: 8/92b lim: 13 exec/s: 25 rss: 70Mb L: 13/13 MS: 1 ChangeBit- 00:08:51.227 [2024-04-25 20:56:06.866648] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:51.227 [2024-04-25 20:56:06.866677] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:51.513 #26 NEW cov: 10861 ft: 16555 corp: 9/105b lim: 13 exec/s: 26 rss: 70Mb L: 13/13 MS: 1 ChangeBit- 00:08:51.513 [2024-04-25 20:56:07.047319] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:51.513 [2024-04-25 20:56:07.047347] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:51.513 #27 NEW cov: 10868 ft: 16735 corp: 10/118b lim: 13 exec/s: 27 rss: 71Mb L: 13/13 MS: 1 ChangeBit- 00:08:51.827 [2024-04-25 20:56:07.225672] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:51.827 [2024-04-25 20:56:07.225702] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:51.827 #28 NEW cov: 10868 ft: 17458 corp: 11/131b lim: 13 exec/s: 14 rss: 71Mb L: 13/13 MS: 1 ChangeByte- 00:08:51.827 #28 DONE cov: 10868 ft: 17458 corp: 11/131b lim: 13 exec/s: 14 rss: 71Mb 00:08:51.827 Done 28 runs in 2 second(s) 00:08:51.827 [2024-04-25 20:56:07.349180] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-5/domain/2: disabling controller 00:08:52.088 20:56:07 -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-5 /var/tmp/suppress_vfio_fuzz 00:08:52.088 20:56:07 -- ../common.sh@72 -- # (( i++ )) 00:08:52.088 20:56:07 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:52.088 20:56:07 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:08:52.088 20:56:07 -- vfio/run.sh@22 -- # local fuzzer_type=6 00:08:52.088 20:56:07 -- vfio/run.sh@23 -- # local timen=1 00:08:52.088 20:56:07 -- vfio/run.sh@24 -- # local core=0x1 00:08:52.088 20:56:07 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:52.088 20:56:07 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-6 00:08:52.088 20:56:07 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-6/domain/1 00:08:52.088 20:56:07 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-6/domain/2 00:08:52.088 20:56:07 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-6/fuzz_vfio_json.conf 00:08:52.088 20:56:07 -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:52.088 20:56:07 -- vfio/run.sh@34 -- # local 
LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:52.088 20:56:07 -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-6 /tmp/vfio-user-6/domain/1 /tmp/vfio-user-6/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:52.088 20:56:07 -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-6/domain/1%; 00:08:52.089 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-6/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:52.089 20:56:07 -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:52.089 20:56:07 -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:52.089 20:56:07 -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-6/domain/1 -c /tmp/vfio-user-6/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 -Y /tmp/vfio-user-6/domain/2 -r /tmp/vfio-user-6/spdk6.sock -Z 6 00:08:52.089 [2024-04-25 20:56:07.629238] Starting SPDK v24.05-pre git sha1 06472fb6d / DPDK 24.07.0-rc0 initialization... 00:08:52.089 [2024-04-25 20:56:07.629309] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid206208 ] 00:08:52.089 EAL: No free 2048 kB hugepages reported on node 1 00:08:52.089 [2024-04-25 20:56:07.664626] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.07.0-rc0 is used. There is no support for it in SPDK. Enabled only for validation. 00:08:52.089 [2024-04-25 20:56:07.701091] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:52.089 [2024-04-25 20:56:07.736691] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:52.348 INFO: Running with entropic power schedule (0xFF, 100). 00:08:52.348 INFO: Seed: 3690488596 00:08:52.348 INFO: Loaded 1 modules (345815 inline 8-bit counters): 345815 [0x27253cc, 0x2779aa3), 00:08:52.348 INFO: Loaded 1 PC tables (345815 PCs): 345815 [0x2779aa8,0x2cc0818), 00:08:52.348 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:52.348 INFO: A corpus is not provided, starting from an empty corpus 00:08:52.348 #2 INITED exec/s: 0 rss: 63Mb 00:08:52.348 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
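The INFO lines just above ("0 files found in .../llvm_vfio_6", "A corpus is not provided, starting from an empty corpus") show run 6 beginning with an empty corpus directory, so every interesting input must be rediscovered inside the run's short time budget. Pre-seeding that directory would give later runs a starting point. A minimal sketch; the directory name is taken from the trace, while the seed file names and byte contents are arbitrary placeholders (what the harness makes of them is up to the fuzz target):

  # Corpus directory name as printed in the trace above.
  CORPUS=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6
  mkdir -p "$CORPUS"
  # Drop a couple of small starter inputs; contents here are examples only.
  printf '\x00\x00\x00\x56' > "$CORPUS/seed-0001"
  printf '\x00\x00\x00\x00\x00\x00\x00\x00' > "$CORPUS/seed-0002"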
00:08:52.349 This may also happen if the target rejected all inputs we tried so far 00:08:52.349 [2024-04-25 20:56:07.966135] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-6/domain/2: enabling controller 00:08:52.608 [2024-04-25 20:56:08.014052] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:52.608 [2024-04-25 20:56:08.014082] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:52.868 NEW_FUNC[1/633]: 0x4a6c50 in fuzz_vfio_user_set_msix /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:190 00:08:52.868 NEW_FUNC[2/633]: 0x4a9190 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:52.868 #31 NEW cov: 10792 ft: 10791 corp: 2/10b lim: 9 exec/s: 0 rss: 68Mb L: 9/9 MS: 4 InsertRepeatedBytes-ChangeBinInt-InsertByte-CopyPart- 00:08:52.868 [2024-04-25 20:56:08.494796] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:52.868 [2024-04-25 20:56:08.494836] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:53.127 NEW_FUNC[1/3]: 0x13afc30 in handle_cmd_req /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:5564 00:08:53.127 NEW_FUNC[2/3]: 0x13c3120 in spdk_nvme_opc_get_data_transfer /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/nvme_spec.h:1782 00:08:53.127 #32 NEW cov: 10836 ft: 14022 corp: 3/19b lim: 9 exec/s: 0 rss: 69Mb L: 9/9 MS: 1 CopyPart- 00:08:53.127 [2024-04-25 20:56:08.703933] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:53.127 [2024-04-25 20:56:08.703964] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:53.386 NEW_FUNC[1/1]: 0x19b3060 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:53.386 #33 NEW cov: 10853 ft: 15412 corp: 4/28b lim: 9 exec/s: 0 rss: 70Mb L: 9/9 MS: 1 ChangeByte- 00:08:53.386 [2024-04-25 20:56:08.903045] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:53.386 [2024-04-25 20:56:08.903074] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:53.386 #34 NEW cov: 10853 ft: 16122 corp: 5/37b lim: 9 exec/s: 34 rss: 70Mb L: 9/9 MS: 1 CrossOver- 00:08:53.645 [2024-04-25 20:56:09.099313] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:53.645 [2024-04-25 20:56:09.099342] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:53.645 #35 NEW cov: 10853 ft: 16258 corp: 6/46b lim: 9 exec/s: 35 rss: 70Mb L: 9/9 MS: 1 ChangeByte- 00:08:53.645 [2024-04-25 20:56:09.298195] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:53.645 [2024-04-25 20:56:09.298224] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:53.904 #36 NEW cov: 10853 ft: 16377 corp: 7/55b lim: 9 exec/s: 36 rss: 70Mb L: 9/9 MS: 1 CopyPart- 00:08:53.904 [2024-04-25 20:56:09.494047] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:53.904 [2024-04-25 20:56:09.494078] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:54.163 #37 NEW cov: 10853 ft: 16645 corp: 8/64b lim: 9 exec/s: 37 rss: 70Mb L: 9/9 MS: 1 CopyPart- 00:08:54.163 [2024-04-25 
20:56:09.692212] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:54.163 [2024-04-25 20:56:09.692241] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:54.163 #38 NEW cov: 10860 ft: 17093 corp: 9/73b lim: 9 exec/s: 38 rss: 70Mb L: 9/9 MS: 1 ChangeBit- 00:08:54.422 [2024-04-25 20:56:09.891201] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:54.422 [2024-04-25 20:56:09.891231] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:54.422 #39 NEW cov: 10860 ft: 17145 corp: 10/82b lim: 9 exec/s: 19 rss: 70Mb L: 9/9 MS: 1 ChangeBit- 00:08:54.422 #39 DONE cov: 10860 ft: 17145 corp: 10/82b lim: 9 exec/s: 19 rss: 70Mb 00:08:54.422 Done 39 runs in 2 second(s) 00:08:54.422 [2024-04-25 20:56:10.029181] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-6/domain/2: disabling controller 00:08:54.682 20:56:10 -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-6 /var/tmp/suppress_vfio_fuzz 00:08:54.682 20:56:10 -- ../common.sh@72 -- # (( i++ )) 00:08:54.682 20:56:10 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:54.682 20:56:10 -- vfio/run.sh@84 -- # trap - SIGINT SIGTERM EXIT 00:08:54.682 00:08:54.682 real 0m19.166s 00:08:54.682 user 0m27.308s 00:08:54.682 sys 0m1.795s 00:08:54.682 20:56:10 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:08:54.682 20:56:10 -- common/autotest_common.sh@10 -- # set +x 00:08:54.682 ************************************ 00:08:54.682 END TEST vfio_fuzz 00:08:54.682 ************************************ 00:08:54.682 20:56:10 -- fuzz/llvm.sh@67 -- # [[ 1 -eq 0 ]] 00:08:54.682 00:08:54.682 real 1m21.737s 00:08:54.682 user 2m6.435s 00:08:54.682 sys 0m8.737s 00:08:54.682 20:56:10 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:08:54.682 20:56:10 -- common/autotest_common.sh@10 -- # set +x 00:08:54.682 ************************************ 00:08:54.682 END TEST llvm_fuzz 00:08:54.682 ************************************ 00:08:54.941 20:56:10 -- spdk/autotest.sh@373 -- # [[ 0 -eq 1 ]] 00:08:54.941 20:56:10 -- spdk/autotest.sh@378 -- # trap - SIGINT SIGTERM EXIT 00:08:54.941 20:56:10 -- spdk/autotest.sh@380 -- # timing_enter post_cleanup 00:08:54.941 20:56:10 -- common/autotest_common.sh@710 -- # xtrace_disable 00:08:54.941 20:56:10 -- common/autotest_common.sh@10 -- # set +x 00:08:54.941 20:56:10 -- spdk/autotest.sh@381 -- # autotest_cleanup 00:08:54.941 20:56:10 -- common/autotest_common.sh@1378 -- # local autotest_es=0 00:08:54.941 20:56:10 -- common/autotest_common.sh@1379 -- # xtrace_disable 00:08:54.941 20:56:10 -- common/autotest_common.sh@10 -- # set +x 00:09:01.514 INFO: APP EXITING 00:09:01.514 INFO: killing all VMs 00:09:01.514 INFO: killing vhost app 00:09:01.514 INFO: EXIT DONE 00:09:04.805 Waiting for block devices as requested 00:09:04.805 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:09:04.805 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:09:04.805 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:09:04.805 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:09:04.805 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:09:04.805 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:09:04.805 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:09:04.805 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:09:05.065 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:09:05.065 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:09:05.065 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:09:05.324 0000:80:04.4 
(8086 2021): vfio-pci -> ioatdma 00:09:05.324 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:09:05.324 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:09:05.584 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:09:05.584 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:09:05.584 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:09:08.873 Cleaning 00:09:08.873 Removing: /dev/shm/spdk_tgt_trace.pid171127 00:09:08.873 Removing: /var/run/dpdk/spdk_pid168626 00:09:08.873 Removing: /var/run/dpdk/spdk_pid169887 00:09:08.873 Removing: /var/run/dpdk/spdk_pid171127 00:09:08.873 Removing: /var/run/dpdk/spdk_pid171869 00:09:08.873 Removing: /var/run/dpdk/spdk_pid172844 00:09:08.873 Removing: /var/run/dpdk/spdk_pid172983 00:09:08.873 Removing: /var/run/dpdk/spdk_pid174107 00:09:08.873 Removing: /var/run/dpdk/spdk_pid174119 00:09:08.873 Removing: /var/run/dpdk/spdk_pid174546 00:09:08.873 Removing: /var/run/dpdk/spdk_pid174873 00:09:08.873 Removing: /var/run/dpdk/spdk_pid175200 00:09:08.873 Removing: /var/run/dpdk/spdk_pid175454 00:09:08.873 Removing: /var/run/dpdk/spdk_pid175652 00:09:08.873 Removing: /var/run/dpdk/spdk_pid175941 00:09:08.873 Removing: /var/run/dpdk/spdk_pid176228 00:09:08.873 Removing: /var/run/dpdk/spdk_pid176546 00:09:08.873 Removing: /var/run/dpdk/spdk_pid177406 00:09:08.873 Removing: /var/run/dpdk/spdk_pid180338 00:09:08.873 Removing: /var/run/dpdk/spdk_pid180710 00:09:08.873 Removing: /var/run/dpdk/spdk_pid180967 00:09:08.873 Removing: /var/run/dpdk/spdk_pid181139 00:09:08.873 Removing: /var/run/dpdk/spdk_pid181632 00:09:08.873 Removing: /var/run/dpdk/spdk_pid181764 00:09:08.873 Removing: /var/run/dpdk/spdk_pid182457 00:09:08.873 Removing: /var/run/dpdk/spdk_pid182465 00:09:08.873 Removing: /var/run/dpdk/spdk_pid182948 00:09:08.873 Removing: /var/run/dpdk/spdk_pid183017 00:09:08.873 Removing: /var/run/dpdk/spdk_pid183503 00:09:08.873 Removing: /var/run/dpdk/spdk_pid183522 00:09:08.873 Removing: /var/run/dpdk/spdk_pid184157 00:09:08.873 Removing: /var/run/dpdk/spdk_pid184443 00:09:08.873 Removing: /var/run/dpdk/spdk_pid184716 00:09:08.873 Removing: /var/run/dpdk/spdk_pid184821 00:09:08.873 Removing: /var/run/dpdk/spdk_pid185134 00:09:08.873 Removing: /var/run/dpdk/spdk_pid185173 00:09:08.873 Removing: /var/run/dpdk/spdk_pid185503 00:09:08.873 Removing: /var/run/dpdk/spdk_pid185802 00:09:08.873 Removing: /var/run/dpdk/spdk_pid186088 00:09:08.873 Removing: /var/run/dpdk/spdk_pid186386 00:09:08.873 Removing: /var/run/dpdk/spdk_pid186671 00:09:08.873 Removing: /var/run/dpdk/spdk_pid186965 00:09:08.873 Removing: /var/run/dpdk/spdk_pid187253 00:09:08.873 Removing: /var/run/dpdk/spdk_pid187548 00:09:08.873 Removing: /var/run/dpdk/spdk_pid187844 00:09:08.873 Removing: /var/run/dpdk/spdk_pid188118 00:09:08.873 Removing: /var/run/dpdk/spdk_pid188388 00:09:08.873 Removing: /var/run/dpdk/spdk_pid188644 00:09:08.873 Removing: /var/run/dpdk/spdk_pid188908 00:09:08.873 Removing: /var/run/dpdk/spdk_pid189179 00:09:08.873 Removing: /var/run/dpdk/spdk_pid189446 00:09:08.873 Removing: /var/run/dpdk/spdk_pid189708 00:09:08.873 Removing: /var/run/dpdk/spdk_pid190007 00:09:08.873 Removing: /var/run/dpdk/spdk_pid190282 00:09:08.873 Removing: /var/run/dpdk/spdk_pid190573 00:09:08.873 Removing: /var/run/dpdk/spdk_pid190854 00:09:08.873 Removing: /var/run/dpdk/spdk_pid191140 00:09:08.873 Removing: /var/run/dpdk/spdk_pid191432 00:09:08.873 Removing: /var/run/dpdk/spdk_pid191759 00:09:09.132 Removing: /var/run/dpdk/spdk_pid192277 00:09:09.132 Removing: /var/run/dpdk/spdk_pid192791 00:09:09.132 Removing: 
/var/run/dpdk/spdk_pid193093 00:09:09.132 Removing: /var/run/dpdk/spdk_pid193632 00:09:09.132 Removing: /var/run/dpdk/spdk_pid194011 00:09:09.132 Removing: /var/run/dpdk/spdk_pid194453 00:09:09.132 Removing: /var/run/dpdk/spdk_pid194981 00:09:09.132 Removing: /var/run/dpdk/spdk_pid195276 00:09:09.132 Removing: /var/run/dpdk/spdk_pid195805 00:09:09.132 Removing: /var/run/dpdk/spdk_pid196203 00:09:09.132 Removing: /var/run/dpdk/spdk_pid196622 00:09:09.132 Removing: /var/run/dpdk/spdk_pid197151 00:09:09.132 Removing: /var/run/dpdk/spdk_pid197452 00:09:09.132 Removing: /var/run/dpdk/spdk_pid197978 00:09:09.132 Removing: /var/run/dpdk/spdk_pid198467 00:09:09.132 Removing: /var/run/dpdk/spdk_pid198799 00:09:09.132 Removing: /var/run/dpdk/spdk_pid199329 00:09:09.132 Removing: /var/run/dpdk/spdk_pid199658 00:09:09.132 Removing: /var/run/dpdk/spdk_pid200148 00:09:09.132 Removing: /var/run/dpdk/spdk_pid200684 00:09:09.132 Removing: /var/run/dpdk/spdk_pid200970 00:09:09.133 Removing: /var/run/dpdk/spdk_pid201504 00:09:09.133 Removing: /var/run/dpdk/spdk_pid201921 00:09:09.133 Removing: /var/run/dpdk/spdk_pid202322 00:09:09.133 Removing: /var/run/dpdk/spdk_pid202861 00:09:09.133 Removing: /var/run/dpdk/spdk_pid203443 00:09:09.133 Removing: /var/run/dpdk/spdk_pid203766 00:09:09.133 Removing: /var/run/dpdk/spdk_pid204304 00:09:09.133 Removing: /var/run/dpdk/spdk_pid204839 00:09:09.133 Removing: /var/run/dpdk/spdk_pid205372 00:09:09.133 Removing: /var/run/dpdk/spdk_pid205700 00:09:09.133 Removing: /var/run/dpdk/spdk_pid206208 00:09:09.133 Clean 00:09:09.391 20:56:24 -- common/autotest_common.sh@1437 -- # return 0 00:09:09.391 20:56:24 -- spdk/autotest.sh@382 -- # timing_exit post_cleanup 00:09:09.391 20:56:24 -- common/autotest_common.sh@716 -- # xtrace_disable 00:09:09.391 20:56:24 -- common/autotest_common.sh@10 -- # set +x 00:09:09.391 20:56:24 -- spdk/autotest.sh@384 -- # timing_exit autotest 00:09:09.391 20:56:24 -- common/autotest_common.sh@716 -- # xtrace_disable 00:09:09.391 20:56:24 -- common/autotest_common.sh@10 -- # set +x 00:09:09.391 20:56:24 -- spdk/autotest.sh@385 -- # chmod a+r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt 00:09:09.391 20:56:24 -- spdk/autotest.sh@387 -- # [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log ]] 00:09:09.391 20:56:24 -- spdk/autotest.sh@387 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log 00:09:09.391 20:56:24 -- spdk/autotest.sh@389 -- # hash lcov 00:09:09.391 20:56:24 -- spdk/autotest.sh@389 -- # [[ CC_TYPE=clang == *\c\l\a\n\g* ]] 00:09:09.391 20:56:24 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:09:09.391 20:56:25 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]] 00:09:09.391 20:56:25 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:09:09.391 20:56:25 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:09:09.391 20:56:25 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:09.391 20:56:25 -- paths/export.sh@3 -- $ 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:09.391 20:56:25 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:09.392 20:56:25 -- paths/export.sh@5 -- $ export PATH 00:09:09.392 20:56:25 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:09.392 20:56:25 -- common/autobuild_common.sh@434 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:09:09.392 20:56:25 -- common/autobuild_common.sh@435 -- $ date +%s 00:09:09.392 20:56:25 -- common/autobuild_common.sh@435 -- $ mktemp -dt spdk_1714071385.XXXXXX 00:09:09.392 20:56:25 -- common/autobuild_common.sh@435 -- $ SPDK_WORKSPACE=/tmp/spdk_1714071385.zWvNGg 00:09:09.392 20:56:25 -- common/autobuild_common.sh@437 -- $ [[ -n '' ]] 00:09:09.392 20:56:25 -- common/autobuild_common.sh@441 -- $ '[' -n main ']' 00:09:09.392 20:56:25 -- common/autobuild_common.sh@442 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:09:09.392 20:56:25 -- common/autobuild_common.sh@442 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk' 00:09:09.392 20:56:25 -- common/autobuild_common.sh@448 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp' 00:09:09.392 20:56:25 -- common/autobuild_common.sh@450 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:09:09.392 20:56:25 -- common/autobuild_common.sh@451 -- $ get_config_params 00:09:09.392 20:56:25 -- common/autotest_common.sh@385 -- $ xtrace_disable 00:09:09.392 20:56:25 -- common/autotest_common.sh@10 -- $ set +x 00:09:09.651 20:56:25 -- common/autobuild_common.sh@451 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user' 00:09:09.651 20:56:25 -- common/autobuild_common.sh@453 -- $ start_monitor_resources 00:09:09.651 20:56:25 -- pm/common@17 -- $ local monitor 00:09:09.651 20:56:25 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:09:09.651 20:56:25 -- pm/common@23 -- $ MONITOR_RESOURCES_PIDS["$monitor"]=213256 00:09:09.651 20:56:25 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 
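start_monitor_resources, entered just above, follows a common bash supervision pattern: each collector script is launched in the background and its PID is recorded in the MONITOR_RESOURCES_PIDS associative array (the assignments of 213256 through 213264 visible in this stretch of the trace, with the launches themselves just below). A simplified, self-contained sketch of that pattern; the collector names and the -d/-l/-p options match the trace, while the output path is an assumption:

  #!/usr/bin/env bash
  declare -A MONITOR_RESOURCES_PIDS
  output_dir=/tmp/pm-output   # placeholder; the real scripts derive this path
  mkdir -p "$output_dir/power"
  for monitor in collect-cpu-load collect-vmstat collect-cpu-temp collect-bmc-pm; do
      # Launch each collector in the background. In the real job the collectors
      # also leave a "$monitor.pid" file behind, which the stop path relies on.
      sudo -E "scripts/perf/pm/$monitor" -d "$output_dir/power" -l \
          -p "monitor.autopackage.sh.$(date +%s)" &
      MONITOR_RESOURCES_PIDS["$monitor"]=$!
  done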
00:09:09.651 20:56:25 -- pm/common@21 -- $ date +%s 00:09:09.651 20:56:25 -- pm/common@23 -- $ MONITOR_RESOURCES_PIDS["$monitor"]=213258 00:09:09.651 20:56:25 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:09:09.651 20:56:25 -- pm/common@23 -- $ MONITOR_RESOURCES_PIDS["$monitor"]=213261 00:09:09.651 20:56:25 -- pm/common@21 -- $ date +%s 00:09:09.651 20:56:25 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:09:09.651 20:56:25 -- pm/common@23 -- $ MONITOR_RESOURCES_PIDS["$monitor"]=213264 00:09:09.651 20:56:25 -- pm/common@21 -- $ date +%s 00:09:09.651 20:56:25 -- pm/common@26 -- $ sleep 1 00:09:09.651 20:56:25 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1714071385 00:09:09.651 20:56:25 -- pm/common@21 -- $ date +%s 00:09:09.651 20:56:25 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1714071385 00:09:09.651 20:56:25 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1714071385 00:09:09.651 20:56:25 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1714071385 00:09:09.651 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1714071385_collect-vmstat.pm.log 00:09:09.651 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1714071385_collect-cpu-load.pm.log 00:09:09.651 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1714071385_collect-bmc-pm.bmc.pm.log 00:09:09.651 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1714071385_collect-cpu-temp.pm.log 00:09:10.589 20:56:26 -- common/autobuild_common.sh@454 -- $ trap stop_monitor_resources EXIT 00:09:10.589 20:56:26 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j112 00:09:10.589 20:56:26 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:10.589 20:56:26 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]] 00:09:10.589 20:56:26 -- spdk/autopackage.sh@18 -- $ [[ 1 -eq 0 ]] 00:09:10.589 20:56:26 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]] 00:09:10.589 20:56:26 -- spdk/autopackage.sh@19 -- $ timing_finish 00:09:10.589 20:56:26 -- common/autotest_common.sh@722 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl 00:09:10.589 20:56:26 -- common/autotest_common.sh@723 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']' 00:09:10.589 20:56:26 -- common/autotest_common.sh@725 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt 00:09:10.589 20:56:26 -- spdk/autopackage.sh@20 -- $ exit 0 00:09:10.589 20:56:26 -- spdk/autopackage.sh@1 -- $ stop_monitor_resources 00:09:10.589 20:56:26 -- pm/common@30 -- $ signal_monitor_resources TERM 00:09:10.589 20:56:26 -- pm/common@41 -- $ local monitor pid pids signal=TERM 
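The stop path (stop_monitor_resources / signal_monitor_resources, whose trace continues below) mirrors that loop: for each monitor it checks whether the collector left a .pid file under power/ and, if so, sends SIGTERM to that PID, as the pm/common@44, @45 and @52 lines that follow show. A condensed sketch continuing the assumptions above; the exact way the PID is read back is inferred from the trace:

  for monitor in collect-cpu-load collect-vmstat collect-cpu-temp collect-bmc-pm; do
      pidfile="$output_dir/power/$monitor.pid"
      # Only signal collectors that actually started; TERM (rather than KILL)
      # gives them a chance to finish writing their .pm.log files.
      [[ -e "$pidfile" ]] && sudo kill -TERM "$(cat "$pidfile")"
  done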
00:09:10.589 20:56:26 -- pm/common@43 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:09:10.589 20:56:26 -- pm/common@44 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:09:10.589 20:56:26 -- pm/common@45 -- $ pid=213283 00:09:10.590 20:56:26 -- pm/common@52 -- $ sudo kill -TERM 213283 00:09:10.590 20:56:26 -- pm/common@43 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:09:10.590 20:56:26 -- pm/common@44 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:09:10.590 20:56:26 -- pm/common@45 -- $ pid=213287 00:09:10.590 20:56:26 -- pm/common@52 -- $ sudo kill -TERM 213287 00:09:10.590 20:56:26 -- pm/common@43 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:09:10.590 20:56:26 -- pm/common@44 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:09:10.590 20:56:26 -- pm/common@45 -- $ pid=213289 00:09:10.590 20:56:26 -- pm/common@52 -- $ sudo kill -TERM 213289 00:09:10.590 20:56:26 -- pm/common@43 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:09:10.590 20:56:26 -- pm/common@44 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:09:10.590 20:56:26 -- pm/common@45 -- $ pid=213288 00:09:10.590 20:56:26 -- pm/common@52 -- $ sudo kill -TERM 213288 00:09:10.848 + [[ -n 48560 ]] 00:09:10.848 + sudo kill 48560 00:09:10.860 [Pipeline] } 00:09:10.880 [Pipeline] // stage 00:09:10.887 [Pipeline] } 00:09:10.905 [Pipeline] // timeout 00:09:10.912 [Pipeline] } 00:09:10.927 [Pipeline] // catchError 00:09:10.933 [Pipeline] } 00:09:10.952 [Pipeline] // wrap 00:09:10.959 [Pipeline] } 00:09:10.975 [Pipeline] // catchError 00:09:10.984 [Pipeline] stage 00:09:10.986 [Pipeline] { (Epilogue) 00:09:11.002 [Pipeline] catchError 00:09:11.004 [Pipeline] { 00:09:11.020 [Pipeline] echo 00:09:11.022 Cleanup processes 00:09:11.028 [Pipeline] sh 00:09:11.312 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:11.312 124325 sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1714071051 00:09:11.312 124357 bash /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1714071051 00:09:11.312 213438 /usr/bin/ipmitool sdr dump /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/sdr.cache 00:09:11.312 214348 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:11.327 [Pipeline] sh 00:09:11.611 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:11.611 ++ grep -v 'sudo pgrep' 00:09:11.611 ++ awk '{print $1}' 00:09:11.611 + sudo kill -9 124325 124357 00:09:11.622 [Pipeline] sh 00:09:11.906 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh 00:09:11.906 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,721 MiB 00:09:11.906 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,721 MiB 00:09:13.298 [Pipeline] sh 00:09:13.585 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh 00:09:13.585 Artifacts sizes are good 00:09:13.600 [Pipeline] archiveArtifacts 00:09:13.606 Archiving artifacts 00:09:13.664 [Pipeline] sh 00:09:13.948 + sudo chown -R sys_sgci 
/var/jenkins/workspace/short-fuzz-phy-autotest 00:09:13.963 [Pipeline] cleanWs 00:09:13.974 [WS-CLEANUP] Deleting project workspace... 00:09:13.974 [WS-CLEANUP] Deferred wipeout is used... 00:09:13.980 [WS-CLEANUP] done 00:09:13.982 [Pipeline] } 00:09:14.003 [Pipeline] // catchError 00:09:14.017 [Pipeline] sh 00:09:14.349 + logger -p user.info -t JENKINS-CI 00:09:14.359 [Pipeline] } 00:09:14.376 [Pipeline] // stage 00:09:14.382 [Pipeline] } 00:09:14.401 [Pipeline] // node 00:09:14.407 [Pipeline] End of Pipeline 00:09:14.444 Finished: SUCCESS
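For reference, the epilogue's "Cleanup processes" step above reduces to a small reusable pipeline: pgrep -af lists every process whose command line mentions the workspace, grep -v drops the pgrep invocation itself from that listing, awk keeps only the PID column, and kill -9 removes the stragglers. A standalone sketch; the workspace path is a placeholder:

  workspace=/var/jenkins/workspace/short-fuzz-phy-autotest
  # $pids is intentionally left unquoted below so multiple PIDs word-split
  # into separate arguments for kill.
  pids=$(sudo pgrep -af "$workspace/spdk" | grep -v 'sudo pgrep' | awk '{print $1}')
  # The trailing '|| true' mirrors the '+ true' in the trace: an empty match
  # must not fail the build step.
  [[ -n "$pids" ]] && sudo kill -9 $pids || true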